Abstract

A gait bout is a period during which an individual performs a continuous physical activity such as walking or running. In the last few decades, the study of gait bouts has led to substantial progress in treating gait impairment (neuropathic, myopathic, and parkinsonian). Recently, gait bout studies have benefited from advances in smartphone technology. Two different human activity scenarios, walking upstairs and standing, are captured using the axis orientation of a smartphone accelerometer. To capture the patterns of walking upstairs and standing, we attach a smartphone to the waist of 30 subjects aged from 19 to 48 years. We propose a human activity recognition model known as the multivariate triple exponential weighted moving average of the martingale sequence using particle swarm optimization (MTMS(PSO)) for the experimental setup. MTMS(PSO) utilizes the martingale framework to capture gait bouts in human activity recognition data. Firstly, MTMS(PSO) is an unsupervised learning method that uses smoothing techniques such as triple exponential smoothing to remove high-frequency noise from the processed activity time series, making the patterns more visible. Secondly, the activity recognition model involves computing a threshold for identifying gait bouts. Thirdly, MTMS(PSO) uses logical precedence and particle swarm optimization to enhance accuracy and precision. As a result, the overall MTMS(PSO) accuracy and G-mean are and , respectively. In addition, the MTMS(PSO) technique outperforms traditional methods such as MRPM(PSO), MGM(PSO), and ELM.

1. Introduction

Ageing is a process of physical and physiological decline that can affect people’s quality of life, resulting in injuries, decreased mental health, or a reduction in physical activity. Human activity recognition (HAR) is an important concept in conventional computing as it applies to real human condition challenges associated with eldercare and medicare. HAR is a research area that can track the gait bout (GB) of an individual through the collection of contextual information about the user’s condition and environment [1]. In this instance, a gait is the pattern of walking, while a GB can be defined as a period of continuous movement (walking or running) [2, 3]. In this case, a GB is the time interval between gait initiation and termination. A GB essentially requires the repetition of stance-swing cycles [4]. Consequently, if the foot is on the ground for more than a specified threshold period, the activity can be considered static and therefore non-GB.

Some research has been successful in discovering GB in HAR time series [5]. However, it is unclear how long an individual can pause movement (walking) within a GB before it is determined to be a separate bout. A pause of within a GB can potentially alter the number of GBs and the durations included in the analysis [3]. Also, detecting composite gait bouts still poses challenges, such as identifying simultaneous movements. For example, an adult with gait impairment (neuropathic, myopathic, or parkinsonian) can be walking toward the kitchen while trying to answer a phone call. These movements are very complex, making sequential analysis complicated. In addition, gait bouts performed by an individual can be interwoven; for instance, a person may be distracted by a knock on the door while going to the restroom and, after attending to the visitor, walk back to the sitting room. Also, there may be ambiguities in defining similar activities performed by the elderly. For example, “opening the cupboard” might represent “preparing food” or “tidying.”

GB analysis could help in understanding the impact and challenges of human movement restriction or disability by identifying transitions and estimating the duration of physical activity. This information can be beneficial for investigating and developing new systems that can identify movement limitations. The world population is ageing, giving rise to neurological impairment and inactive conditions (arthritis, Parkinson’s disease) that can impede physical activity [6, 7]. The risk associated with these disorders represents a significant challenge to medical practitioners. Therefore, as mentioned earlier, quantitative activity monitoring can help to discover and investigate physical movement [8, 9]. The focus and motivation of our work are summarized as follows:
(i) The purpose of studying and monitoring GB in activity recognition is, for example, to detect movement patterns and early signs of physical impairment experienced by an individual. Furthermore, this process could allow clinicians to assess the movement progress made by people (particularly those with physical impairment) when testing a solution’s efficacy [10, 11].
(ii) The study of GB could help explore and investigate the effect of free-living environments on people with gait disorders [12]. In addition, studying GB could allow medical practitioners to identify walking disorders from the onset.
(iii) Studying GB could enable the tracking of disease progression and testing of the effectiveness of preventive measures and solutions that can help improve movement irregularities.
(iv) The study of GB could also enable clinicians to monitor and assist people with gait abnormalities, as they may demonstrate unusual movements, which include symmetry of upper and lower limb swinging and other normal joint kinematics [13-15].
(v) By studying and monitoring the gait pattern of people with gait disabilities, proper movement modifications can be advised to enhance their moving style and long-term well-being [16].
(vi) The study of GB using human activity data obtained from smartphones (see subsequent paragraphs) can offer low-power, low-cost, continuous remote screening tools for pathology identification, enabling adjustment to individual needs and reducing the burden on clinicians and medical practitioners.

The principal objective of this work is to devise ways of detecting GB with wireless devices (such as smartphones) so that people with movement abnormalities can be monitored and supported in living a healthy lifestyle as they perform daily activities. Furthermore, wearable technology such as accelerometers and gyroscope sensors can analyze GB in real-life situations during human movement [12, 17]. Accelerometer sensors estimate the displacement of a mass using a location-measuring circuit. That estimate is then transformed into a digital electrical signal for data processing via an analog-to-digital converter (ADC). A gyroscope sensor, on the other hand, measures and maintains orientation and angular velocity. HAR monitoring using these sensors can characterize human movement (e.g., walking) given a set of observations. This can be achieved by monitoring and analyzing walking information acquired from various sources such as the environment and sensors [18, 19].

Sensors such as accelerometers and gyroscopes are encapsulated in smartphone devices and can be attached to designated segments of the body such as the wrists, waist, chest, and thighs [20, 21]. The study of HAR through accelerometer and gyroscope sensors produces crucial information about an individual's daily movement and lifestyle [22]. The widespread use of smartphones facilitates GB tracking because these devices combine high processing power, communication capabilities, and sensors such as accelerometers and gyroscopes [23]. As previously discussed, accelerometers and gyroscopes in smartphones can reflect the duration of human movement through measures of velocity and displacement [24]. Smartphones are portable and require no complex architecture to use. This characteristic enables a smartphone to acquire GB information for HAR [23, 24]. Smartphones (with built-in inertial sensors such as gyroscopes and accelerometers) are prevalent nowadays, as they make information and communication services available to individuals as they perform their daily activities. However, sensors embedded in smartphones have drawbacks such as random zero bias and oscillation noise, which affect the readings [25].

This paper proposes a heuristic thresholding method called the multivariate triple exponential weighted moving average of the martingale sequence (MTMS) based on previous work [26]. This method can identify GB(s) and estimate their duration using a smartphone attached to the wrist in unsupervised real-world situations. The novelty of this study is as follows:
(i) Our method combines the martingale theory with the triple exponential moving average (TEMA) in a novel way to solve the problem of GB detection in HAR time series.
(ii) Unlike many GB detection algorithms that need a sliding window [27, 28], where the window length is chosen based on the accuracy of the method, the MTMS algorithm does not require sliding windows. Therefore, the approach does not need to tune a window length against the model accuracy.
(iii) Our approach uses optimized parameters to obtain the best performance for detecting GB in HAR sequences.

To improve the accuracy and precision of the MTMS method, we utilize the concept of optimization. Tuning the input parameters aims to discover the parameter values that most improve the algorithm's performance [29-33]. An example of such an optimization technique is particle swarm optimization (PSO). PSO [34] (discussed further in Section 3.3) is applied to optimize the parameters of our MTMS method for improved performance. Optimization methods such as the genetic algorithm (GA) [35] can be computationally expensive and demand many iterations. In this case, PSO is a better alternative as it can handle complex problems and requires a small number of parameters with a correspondingly lower number of iterations [36, 37]. For our proposed approach, we implement PSO to maximize the G-mean. We use G-mean rather than F1 as our preferred metric because it measures the balance between classification performance on the positive and negative classes [38]. Compared to F1, the G-mean computation can prevent overfitting the negative class and underfitting the positive class [39]. To handle the challenge of distorted GB points, we also implement precedence rules [40]. These precedence rules determine the grouping of bouts in the HAR data set and deal with the challenges of alignment and distortion of bout points. The optimization process and the implementation of the precedence rules are further discussed in Section 3. We benchmark our proposed technique against traditional methods (discussed in Section 3.4) for performance comparison.

The paper is structured as follows: In Section 2, we review the latest work on GB detection and discuss the stages of the algorithm; in Section 3, we introduce our proposed approach; in Section 4, we present our experimental results and compare them with existing methods; in Section 5, we relate the results obtained in Section 4 to the healthcare challenges stipulated in Section 1. We summarize and conclude the paper in Section 6, providing insight into the outcomes and the next steps of the research.

2. Related Work

In the last few decades, many gait bout detection techniques have been proposed for human activity recognition; as a result, recent research has focused on measuring gait speed in clinical settings. For instance, biometric gait pattern classification using an extreme learning machine (ELM) approach can be implemented to detect early gait abnormalities such as brain or neurological disorders. These abnormalities cannot be discovered by visual observation alone; they also require robust quantitative analysis of an individual's movement. This procedure can assist in comprehending the neuromuscular mechanics related to brain disorders, which motivated Patil et al. [41] to evaluate the performance of multi-class gait classification using several machine learning methods, namely, KNN, SVM, ELM, and MLP. Experimental results showed that the ELM gave good results ( overall classifications) when used to analyze the neuromuscular mechanics of patients with multiple sclerosis and stroke. However, the model is a classification-based supervised learning approach that can be complex to apply to GB detection.

The push recovery capability a person acquires is centred on learning, but the underlying learning mechanism is not well defined. Different models based on conventional mechanics and controls have been created to explore this mechanism. Nevertheless, these models have limitations. Semwal et al. [42] believe that an efficient computational model centred on learning can address these restrictions and proposed a model that collects humanoid push recovery data using the accelerometer sensors in a smartphone. Experimentation is performed using the proposed HMCD and HLPRDCD solutions to demonstrate knee, hip, and ankle joint angle changes when analyzing smartphone data. Results show that smartphone data collection is more accurate than a potentiometer based on HMCD. Also, executing LVQ shows that push recovery capability depends on age, height, weight, sex, race, ambidexterity, etc. The limitation of the model is that it does not implement an optimized computational approach based on hybrid automata that can coordinate biped robot push recovery the way humans do.

Gait study is vital for identifying a person from a distance. However, the major issues associated with human gait-based identification are high variability, movement obstruction, pose and speed variance, and regular gait cycle detection. These challenges motivated Semwal et al. [43] to develop an algorithm that explores the CASIA A, B, and C data sets for view-, clothing-, and speed-invariant human identification. The study comprises a robust approach that utilizes computer vision for human identification. The suggested technique consists of feature extraction procedures: gait energy image (GEI) for cloth invariance, histogram of gradients (HOG) for multiview invariance, and Zernike moments with the Radon transform for cross-view invariance. The following methods, namely, SVM, ANN, and XGBoost-based approaches, were implemented on the CASIA data set. The algorithms achieved , , and detection accuracy, respectively, for the three invariance scenarios (speed, cloth, and pose). However, the model utilizes supervised learning methods that can be complex to develop for real-time GB detection and require a lot of computational time.

Most approaches for analyzing HAR involve robust feature extraction and time series pre-processing. However, this can entail a lot of human effort that is time-consuming and application-specific. This situation inspired Dua et al. [44] to propose a deep neural network-based model that combines a convolutional neural network and a gated recurrent unit as an end-to-end model. This model performs automatic feature extraction and categorization of the HAR scenarios. Experiments were carried out utilizing the wearable sensors' raw data with nominal pre-processing and no handcrafted feature extraction. The accuracies obtained from the analysis of the UCI-HAR, WISDM, and PAMAP2 data sets were , , and , respectively. The results show that the method outperforms some related approaches. However, the model implements a supervised learning approach that can be complicated to develop for real-time GB detection and requires a lot of computational time.

Atrsaei et al. [40] suggested an approach based on a single-sensor setup. This technique was developed and validated in clinical and home settings with a sensor attached to the lower back. The algorithm uses a naive Bayes classifier (a family of probabilistic classifiers based on applying Bayes' theorem with strong (naive) independence assumptions between features) to discover gait bouts in activities around the home environment. The approach is validated using data accumulated from patients with movement disabilities such as multiple sclerosis. The method achieved a precision of for evaluating gait speed with a bias rate of zero. However, the performance is still limited by noise interference in data obtained via wireless sensors. The technique's capability of assessing unsupervised mobility is reflected in its accuracy of and F1 score of .

Gait bout study motivated Barrett et al. [45], who suggested a way to improve the evaluation of gait bouts in Fitbit devices. This is achieved by evaluating gait bouts against the modelled physical activity level from the Fitbit Flex. The paper's primary purpose is to contrast the “gold standard” ActiGraph with the modelled Fitbit Freedson approach and ascertain typical values of the expected errors in gait bout identification between the two devices and methods. The techniques are proxy methods for estimating actual physical activity levels. The approach uses three techniques to compare bout detection: the ActiGraph Freedson method, the Fitbit Intensity Score, and the modelled Fitbit Freedson, utilizing three different outcomes. Firstly, the authors compare the duration of gait bouts obtained from the ActiGraph GT3X technique to the baseline of each subject performing physical activity in a day. Secondly, the same procedure determines the intensity scores achieved by the Fitbit and the modelled Fitbit Freedson approaches. Lastly, the authors contrast the gait bouts discovered by the three methods against labelled gait bouts recorded in a self-report diary for performance evaluation. This process is still restricted by noise, impacting the algorithms' overall performance.

GB detection can also be performed using a wrist-mounted sensor measurement unit. However, the extensive freedom of wrist motion in daily life is a severe obstacle to robust and precise GB analysis. These challenges motivated Soltani et al. [46] to suggest an approach for identifying GB using a wrist-mounted accelerometer. The method uses a Bayes estimator (a decision rule that minimizes the posterior expected value of a loss function) and the least absolute shrinkage and selection operator (LASSO), which identifies the best possible features to maximize performance on the training data set. The LASSO can select characteristics covering all biomechanical benchmarks (intensity, posture, periodicity, and non-gait dynamicity). The Bayes estimator and LASSO techniques are followed by two physically meaningful post-categorization steps to handle the problems of wrist movement in real-life situations. The proposed approach was validated against two data sets consisting of healthy young and older people, respectively. The algorithm achieved a satisfactory interquartile range within to for accuracy, sensitivity, precision, and F1 score in the identification of GB. The algorithm also produces a high correlation of between the proposed and reference methods for the total duration of GB discovered. This correlation can be further improved by isolating noise in the HAR sequence.

Ramakrishnan et al. [47] proposed a gait asymmetry metric (CGAM) that synthesizes human gait motion. CGAM is weighted by normalizing the data to balance each parameter's effect and combines spatial, kinematic, and temporal asymmetry parameters. The approach can enhance the assessment of gait patterns and assist mechanized recovery approaches. CGAM also computes quantifiable thresholds to produce an efficient, comprehensive, overall measure of gait asymmetry. The study combines clinical measures such as the six-minute walk test (6MWT), timed up and go (TUG), and gait velocity. These are applied to gait data obtained from individuals with movement disabilities, such as stroke patients, before and after recovery. Experimental results show that CGAM can produce a higher correlation in estimating GB. However, this correlation is still limited by noise interference in the HAR data set.

The concept of human activity recognition is becoming relevant in the healthcare domain for observing and monitoring disabilities associated with movement. However, some HAR analysis models can identify changes in the HAR data stream but cannot measure the change intensity and duration. This situation motivated Etumusei et al. [48] to suggest an unsupervised learning technique, namely, the multivariate exponential weighted moving average of the martingale sequence using a genetic algorithm (MEWMS(GA)). The method can discover change duration and intensity by applying the martingale framework to HAR data sets. The model also uses optimization, executing a genetic algorithm to obtain the optimal parameter value for the weighted average. Experimental outcomes show that the proposed approach improves over current martingale techniques. However, the method does not focus on estimating gait bout(s).

Ho and Wechsler [26] recommended a martingale framework based on testing the exchangeability property of data values to discover changes in time series. The method comprises a clustering approach and the implementation of a metric called strangeness, which shows how data points differ. Consequently, the strangeness was used to compute the p-value and the randomized power martingale (RPM). The experiments performed show that this method can detect transitions in a data sequence. However, there are restrictions to the algorithm's performance. For example, the technique can capture false positives due to noise. Also, the method can analyze unsupervised multivariate data sequences.

Consequently, in some of our previous work [48-50], we proposed several robust thresholding methods that can analyze time series to detect changes. These methods operate in two stages. Firstly, they use smoothing techniques such as a moving median or a Gaussian moving average to isolate and minimize the impact of noise in both univariate and multivariate time series. Secondly, they use optimization methods such as GA and PSO (see Section 1). However, these techniques were not designed for GB detection.

2.1. Background of Research

As we can deduce from the preceding paragraphs, current algorithms can discover GB in HAR data sets acquired from wireless sensor technologies. However, the performance of these approaches is still limited by noise interference, which can distort the results. Also, some of these approaches are not optimized for maximum performance. Furthermore, some of these methods use a supervised learning approach to detect changes in a HAR data set. The supervised learning approach has the following restrictions: it can be complex and expensive to implement, it takes a lot of computation time, and it cannot be used in real time. Moreover, it might be challenging to use with dynamic and growing data [51].

To address the limitations stated previously, we present an unsupervised learning model [52] known as the multivariate triple exponential weighted moving average of the martingale sequence (MTMS) method. Further elaboration of the research novelty outlined in Section 1 includes:
(i) The MTMS model is straightforward and does not require much computation time.
(ii) Our model can be applied to dynamic and growing data without restriction.
(iii) The MTMS approach can identify GB(s) and estimate their duration by analyzing HAR data obtained using a smartphone attached to the wrist in unsupervised real-world situations.

As previously explained in Section 1, the MTMS approach uses the martingale framework and the triple exponential moving average (TEMA) as a smoothing factor [53, 54]. The triple exponential moving average can smooth time series fluctuations, making it easier to discover patterns without the lag associated with traditional moving averages (MA) [55, 56]. Informally, it does this by taking several exponential moving averages (EMA) of the actual EMA and subtracting some of the lag. TEMA can help to identify trend order and short-term signal alteration. TEMA is more suitable than the EMA for this type of analysis because it reacts more adequately to trend variation in time series [54, 57]. In addition, we apply an optimization algorithm (PSO), which is further explained in Section 3.3. PSO provides the optimal parameter values for enhancing the performance of the proposed algorithm.

2.2. Applicability of the Suggested Approach

The proposed approach can be translated into a real-time healthcare system that monitors an individual's GB(s) during physical activities. This system can take the form of a mobile app that processes and analyzes accelerometer data sets for GB discovery. Our proposed system or application can log a subject's movement throughout the day. This application can also help observe patients with movement disabilities (especially the elderly) or older adults living alone. Furthermore, clinicians can use the information from this health application to provide adaptive healthcare services to users. The relevance of this smartphone system is outlined below:
(i) It is cost-effective, as it does not require additional hardware or sensors.
(ii) It is mobile, as the phone can be carried in a pocket.
(iii) It is non-obtrusive and does not make a subject feel awkward or uncomfortable.
(iv) It does not require supervision, as smartphones accompany individuals daily.
(v) A smartphone is likely to be with users during their daily activities, which makes our health application effective in monitoring an individual's movement.

Our proposed system involves five distinct stages: the pre-processing of the HAR data set obtained using a smartphone, the analysis of the HAR data sequence using our recommended algorithm, the recognition and the discovery of GB, and finally, the re-initialization of the system.

3. Human Activity Recognition Model

This section explains the proposed approach for analyzing the HAR data set. Our recommended technique aims to identify GB using an accelerometer embedded in a smartphone attached to the wrist in unsupervised real-world situations. In the following paragraphs, we shall discuss the HAR data, pre-processing approach, the martingale concept, threshold computation, logical preferences, particle swarm optimization technique, and applicability of our recommended model.

3.1. HAR Data Set

The HAR data is publicly available from the UCI machine learning repository through the research work of Anguita et al. [1, 58]. The data is then adapted to include only the standing and walking upstairs scenarios. The HAR sequences acquired from smartphone devices attached to different participants were collected from volunteers ranging from to years old. Each volunteer performed the experiment protocol twice, and each scenario was executed at least twice in each test to imitate repetition (see Table 1). A timeout of between and , in which the subject remains still (standing), was arranged to separate the walking upstairs scenarios. This data set is obtained from an accelerometer embedded in a smartphone. The smartphone is mounted on a suitable part of the body (preferably the waist) of the participant to make it possible to monitor and record the gait scenarios performed by individuals [1].

A smartphone is recommended for the experiment even though there are many limitations to using mobile devices in an unconstrained environment. These limitations include variations in mobile hardware architecture, low memory capacity, non-robust operating systems, embedded sensor quality, and the periodicity of acquired data. These constraints make HAR experiments difficult. However, smartphones can overcome some of these challenges. Moreover, the signals received using on-body sensors such as smartphones are potentially more beneficial than signals obtained by video cameras for the reasons outlined below:
(i) Smartphones can mitigate the limitations of environmental constraints and stationary settings that cameras often encounter [59].
(ii) The signal information obtained from on-body sensors such as smartphones is accurate, efficient, and effective [59].
(iii) Smartphones enjoy the advantage of information privacy in contrast to data acquired using video or cameras [59].

The Samsung Galaxy S II (GT-I9100) has of storage (microSDHC), of RAM, and a dual-core 1.2 GHz Cortex-A9 processor. These features make it an ideal device for the experiment. The record of each movement comprises the triaxial acceleration and an action label obtained from video recording. Each participant executes the scenario twice using a smartphone (Samsung Galaxy S II) attached to the waist of the volunteers (the most prominent smartphone placement for effective accelerometer orientation capture is either the human waist or thigh for the walking upstairs scenario [60]).

Smartphones have an embedded triaxial accelerometer that can estimate the subject's acceleration. Figure 1 shows the axis orientation of the inertial sensor of the waist-mounted smartphone used in the experiments and its casing. Furthermore, Figure 2 shows the orientation of the smartphone attached to the waist. The x-axis estimates the vertical movement, the y-axis measures the horizontal movement in the lateral direction, and the z-axis estimates the motion in the posterior-anterior direction. The acceleration signal was recorded at a constant rate of , which is moderately fast for capturing human body movement information [61]. The labelling procedure was carried out manually by using the videos recorded during the experiments as ground truth and contrasting them with the log files of the inertial signals [1].

Our goal is to identify GB (the continuity of the same movement patterns) in the experiment. As mentioned earlier, a GB is the time interval between gait initiation and termination. A GB essentially requires the repetition of stance-swing cycles [4]. Consequently, if the foot is on the ground for more than a certain threshold period, the activity can be considered static and therefore non-GB. The participants perform several actions, such as the standing and walking upstairs dynamic scenarios. The walking upstairs scenario is treated as GB or mobility, while standing is classified as non-mobility or non-GB.

These actions are repeated several times in the data set [40]. Table 1 summarizes the scenarios discussed.

The accelerometer encapsulated in the smartphone is used to capture triaxial linear acceleration and triaxial angular velocity at a constant rate of , which is sufficiently fast for capturing human body motion information [61]. The multivariate triaxial data accumulated from these activities were labelled manually using video recordings. In addition, the acquired data set was randomly divided into two sets, where of the participants were chosen for creating the training data and of the participants for developing the test data. The partition was randomized to ensure that no samples from the same user appear in both subsets.

3.2. The Pre-Processing Approach

The sensor signals, acquired through the accelerometer embedded in the smartphone, are pre-processed by applying various filters [1]. Firstly, a median filter and a low-pass Butterworth filter with a cut-off frequency of Hz are applied to reduce noise in the signal. The Butterworth filter is a signal processing filter designed to produce a frequency response that is as flat as possible in the passband [63]. The cut-off frequency was chosen according to the work presented in [61], which stipulates that the energy spectrum of human movement lies within the range of and . From these procedures, a triaxial total acceleration was produced. The clean signal is expressed as the sum of two acceleration vectors, the gravitational component and the body motion acceleration, which were separated by applying another low-pass filter (assuming that the gravitational component only influences the lowest frequencies). The experiment in separating the two signals shows that was the maximal baseline frequency to attain a constant gravity . This result was achieved by varying the baseline frequency from to in increments of 0.025 Hz and measuring the minimum square error of the filtered gravity signal after subtracting the standard gravity constant (). The acceleration time derivative (), otherwise known as jerk, was then estimated. Subsequently, the signal is sampled using a fixed-width sliding window of about and an overlap of , equivalent to readings per window. This overlap has proved effective in other HAR methods, such as the work presented in [64, 65].

The fixed-width sliding window and overlap were chosen for the following reasons [1]:
(i) The cadence range of a normal individual walking is between and steps per minute [66], which represents a minimum speed of 1.5 steps per second.
(ii) A minimum walking cycle consists of two practical steps per window sample.
(iii) A minimum speed of is chosen as the average human cadence so that individuals with slower cadence due to disability or age are not excluded from the experiment.
(iv) Frequency domain signals demand the fast Fourier transform (FFT) [67], which favours window lengths that are powers of two ( × = 128).
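To make the filtering and windowing pipeline concrete, the sketch below shows one possible implementation in Python. It is illustrative only: the cut-off frequencies (20 Hz for de-noising, 0.3 Hz for gravity separation), the 50 Hz sampling rate, and the 128-sample windows with 50% overlap are assumptions based on typical settings for this data set, and the function names are ours rather than the original pipeline's.

```python
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

# Assumed constants (illustrative, not taken from the paper's elided values).
FS = 50.0              # sampling rate (Hz)
NOISE_CUTOFF = 20.0    # low-pass cut-off for de-noising (Hz)
GRAVITY_CUTOFF = 0.3   # low-pass cut-off separating gravity from body motion (Hz)
WIN_LEN = 128          # samples per window (about 2.56 s at 50 Hz)
STEP = 64              # 50% overlap

def lowpass(x, cutoff, fs=FS, order=3):
    """Zero-phase Butterworth low-pass filter applied along the time axis."""
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, x, axis=0)

def preprocess(acc):
    """acc: array of shape (n_samples, 3) with raw triaxial accelerometer data."""
    # 1. Median filter plus low-pass Butterworth filter to suppress noise.
    denoised = np.column_stack([medfilt(acc[:, i], kernel_size=3) for i in range(3)])
    denoised = lowpass(denoised, NOISE_CUTOFF)
    # 2. Separate the gravitational component (lowest frequencies) from body motion.
    gravity = lowpass(denoised, GRAVITY_CUTOFF)
    body = denoised - gravity
    # 3. Jerk: time derivative of the body acceleration.
    jerk = np.gradient(body, 1.0 / FS, axis=0)
    return body, gravity, jerk

def sliding_windows(signal, win_len=WIN_LEN, step=STEP):
    """Yield fixed-width, overlapping windows of the pre-processed signal."""
    for start in range(0, len(signal) - win_len + 1, step):
        yield signal[start:start + win_len]
```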

Furthermore, for each window, a vector of features (from the accelerometer 3-axial raw signals, tAcc-XYZ) was acquired by computing variables in the time and frequency domains. This feature extraction, namely, signal magnitude area (SMA), mean, standard deviation (STD), entropy, and signal-pair correlation (Corr), was implemented in previous work [1]. Subsequently, the FFT was utilized to locate the frequency components of each window. Finally, a normalization technique is used to remove data redundancy [68, 69].

The labelled, normalized feature vectors for the standing and walking upstairs scenarios are illustrated separately in Figure 3. Each plot in Figure 3 represents one of the accelerometer 3-axial signals X, Y, and Z. The red lines in the plots delimit the regions in which the participant walks up the stairs at different time intervals in a dynamic HAR scenario. The rest of the data represents inactive periods when the participant is only standing. Finally, these patterns are analyzed using an unsupervised learning approach (see Section 3.3) for the discovery of GB. The HAR process is illustrated in Figure 4.

3.3. GB Detection Model

The pre-processed data is analyzed using the martingale framework, which originated in probability theory and was initially observed in gambling. The idea has since been utilized in diverse sectors such as finance (for financial asset pricing) [70]. In addition, martingale theory has been used in domains such as survival analysis, decision-making, and investment optimization [71]. In [49, 50], the research outcomes demonstrate that the martingale idea can be applied to the statistical analysis of time series such as terrestrial data and HAR. These outcomes show the martingale methodology's practicability and efficacy in discovering changes in the data-generating model for time series data streams. Ho and Wechsler [26] suggested a martingale approach, namely, the randomized power martingale (RPM). This method can obtain better precision and recall than the traditional sequential probability ratio test for change identification. We apply the martingale procedure through a heuristic thresholding method to analyze HAR data sets for GB detection. To improve the result's precision and recall, we use several techniques such as TES, logical precedence, and PSO optimization. The advantage of the proposed algorithm is that it can obtain better precision and recall and does not use sliding windows like the RPM method in GB detection.

The profile of the data analysis involves the following steps:
(i) Compute the martingale point.
(ii) Compute the MTMS point (see Section 3.3).
(iii) Compute the threshold (see Section 3.3).
(iv) Determine the GB(s) that occur in HAR.
(v) Apply logical precedence to improve GB(s) discovery (see Section 3.3).
(vi) Optimize the algorithm's parameters to further improve its accuracy and precision (see Section 3.3). We shall further explain this procedure in the following sections.

3.3.1. Randomized Power Martingale

Ho and Wechsler [26] suggested an extension to the martingale approach by proposing a metric known as strangeness. Strangeness represents the extent to which a new data point differs from the previous ones in a time series.

Let us consider a series $\{x_1, \ldots, x_{n-1}\}$ to which a new point $x_n$ has just been registered. Let us also examine the situation where the data points in the series have been clustered into disjoint sets [72].

Definition 1. The strangeness $s_n$ of $x_n$ is defined as $s_n = d(x_n, c_j)$, where $c_j$ is the midpoint of the cluster $C_j$, for some $j$ such that $x_n \in C_j$, and $d$ represents the selected distance.

The strangeness of $x_n$ is used to compute a “probability” time series whose points are called $p$-values. If, for $i = 1, \ldots, n$, $s_i$ is the strangeness of $x_i$ and $\theta_n$ is a fixed value in $[0, 1]$ [26, 73], then $p_n$ is computed as follows:
$$p_n = \frac{\#\{i : s_i > s_n\} + \theta_n\, \#\{i : s_i = s_n\}}{n},$$
where “$\#$” counts the number of indices $i$ satisfying the stated condition. For instance, $\#\{i : s_i > s_n\}$ is the number of $i$ satisfying $s_i > s_n$, where $s_i$ is the strangeness estimate stated in equation (1).

Intuitively, $p_n$ measures the probability of a point being more strange than $x_n$. Thus, the set of $p$-values can be utilized to compute a new sequence known as the randomized power martingale.

Definition 2 (see [26]). The randomized power martingale (RPM) is given, for $\epsilon \in [0, 1]$, by the value $M_n^{(\epsilon)}$ determined at each time point as
$$M_n^{(\epsilon)} = \prod_{i=1}^{n} \left(\epsilon\, p_i^{\,\epsilon - 1}\right).$$
Therefore, for a fixed $\epsilon$, we can compute $M_n^{(\epsilon)}$ sequentially. This model will discover GB when $M_n^{(\epsilon)} > h$, where $h$ is the threshold. The threshold will be explained in subsequent paragraphs.

Let us consider a data sequence in which each point $x_n$ is a vector and $k$ is the number of variables in the study. The martingale value $M_n^{(\epsilon)}$ is computed for each variable at any time point. The next step involves reducing this new multidimensional sequence to a single metric: the mean of the $k$ martingale values is determined using the equation
$$\bar{M}_n = \frac{1}{k} \sum_{j=1}^{k} M_n^{(\epsilon, j)},$$
where $M_n^{(\epsilon, j)}$ is the martingale value of the $j$-th variable of $x_n$.

$\bar{M}_n$, known as the multivariate randomized power martingale (MRPM), will be our new point for analysis in the multivariate HAR data set. In the next section, we introduce a method that improves the performance of the martingale approach described above and explain the computation of the threshold $h$. The proposed method can estimate GB in HAR.
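A minimal Python sketch of this sequence of computations is shown below. The cluster centres, the randomization term, and the value epsilon = 0.92 are assumptions for illustration only, and the function names are ours rather than the notation of [26].

```python
import numpy as np

def strangeness(point, centers):
    """Strangeness: distance from the new point to its nearest cluster centre."""
    return min(float(np.linalg.norm(point - c)) for c in centers)

def randomized_p_value(past_strangeness, s_new, rng=None):
    """Randomized p-value: proportion of strangeness values at least as large as s_new."""
    rng = rng or np.random.default_rng()
    s = np.asarray(list(past_strangeness) + [s_new])
    greater = np.sum(s > s_new)
    equal = np.sum(s == s_new)
    return float((greater + rng.uniform() * equal) / len(s))

def rpm_update(m_prev, p, eps=0.92):
    """One step of the randomized power martingale: M_n = M_{n-1} * eps * p_n**(eps - 1)."""
    return m_prev * eps * (p ** (eps - 1.0))

def mrpm_point(per_variable_martingales):
    """Collapse the per-variable martingale values into a single multivariate point."""
    return float(np.mean(per_variable_martingales))
```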

3.3.2. Multivariate Triple Exponential Weighted Moving Average of the Martingale Sequence (MTMS)

In previous work, Alevisakos et al. [74] used time-varying and asymptotic control limits to examine the mathematical properties of the triple exponential weighted moving average (TEWMA), or triple exponential smoothing (TES), chart. The TEWMA chart is more effective at detecting small shifts in the process mean than the double exponential weighted moving average (DEWMA) and EWMA charts, respectively. Examples of TES implementation can be found in the work of Ongiri et al. [75], who use TES to remove high-frequency noise from hydrological data to make vital predictions of water demand. Also, Dev et al. [55] suggested a triple exponential smoothing based forecasting methodology that isolates high-frequency noise from time-varying estimates of solar irradiance, making the time series pattern amenable to forecasts of up to .

As discussed earlier, this previous work motivates us to propose a thresholding method that implements TES on the martingale framework to discover GB in HAR time series.

Triple exponential smoothing (TES) extends exponential smoothing to support time series patterns that repeat at every element, where can be any number other than one [76]. To obtain the new point of the sequence, we use TES to define the exponential weights over time [77]. Finally, TES is applied in signal analysis to remove high frequencies encountered in the signal [77]. Suppose we have a sequence of observations $\{y_1, \ldots, y_t\}$ up to a given point $t$. The EWMA is given as
$$z_t = \alpha\, y_t + (1 - \alpha)\, z_{t-1},$$
where $z_t$ is the estimate of the succeeding value of $y_t$ and $\alpha$ is the smoothing factor (SF) in $[0, 1]$. The SF represents the weighting applied to the most recent observation. The higher the SF, the more weight is placed on the current observation and the less weight is placed on previous observations.

Given a data sequence with a repeating pattern of cycle length L, L is a positive integer that denotes the number of preceding samples; it also refers to the number of data points after which a new season begins. Consequently, TES [77-80] is given as: where is the measure of at point . Note that is associated with the level of the series, is associated with the trend, and is associated with the repeated pattern factors [77]. For our technique, we express the relation of the smoothing factors as and . For a given time , is the cycle of observation, and represents the expected proportion of the trend that was predicted [55]. To initialize the set of seasonal factors, a minimum of 2L periods is required, which corresponds to two entire seasons of historical data. is the offset into the index of the seasonal component relative to the last set of observed time series points. Consequently, we apply TES to the martingale points to obtain the MTMS points. The MTMS point is represented as follows: where is the martingale sequence at a given point with size and , and is the estimator of the succeeding martingale value (). The following section discusses GB detection by presenting a thresholding method using MTMS points.
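The following sketch shows one standard (additive Holt-Winters) form of triple exponential smoothing applied to a one-dimensional sequence such as the martingale points. It is a simplified illustration: the exact recurrence and the smoothing factor values used for MTMS are those defined above and later tuned by PSO, so the defaults here are placeholders.

```python
import numpy as np

def triple_exponential_smoothing(series, L, alpha=0.5, beta=0.3, gamma=0.2):
    """Additive Holt-Winters (triple exponential) smoothing of a 1-D sequence.

    series : sequence of martingale points; L : season length in samples.
    alpha, beta, gamma : level, trend, and seasonal smoothing factors (placeholders).
    """
    series = np.asarray(series, dtype=float)
    n = len(series)
    if n < 2 * L:
        raise ValueError("need at least 2L points (two full seasons) to initialise")
    # Initial level, trend, and seasonal components from the first two seasons.
    level = np.mean(series[:L])
    trend = (np.mean(series[L:2 * L]) - np.mean(series[:L])) / L
    season = [series[i] - level for i in range(L)]
    smoothed = []
    for t in range(n):
        value = series[t]
        prev_level = level
        level = alpha * (value - season[t % L]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % L] = gamma * (value - level) + (1 - gamma) * season[t % L]
        smoothed.append(level + trend + season[t % L])
    return np.array(smoothed)
```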

3.3.3. Threshold Computation

Ley et al. [81] suggested a technique that is defined as , where denotes the mean of the data points and the median absolute deviation. We compute the threshold using , where denotes the median of our time series window. This threshold enables the discovery of GB in the HAR data set. The condition for GB detection is

MTMS can determine GB in the data set. If the MTMS point is greater than the threshold , a GB is detected; if the MTMS point is less than the threshold, no GB has taken place. The interval in which the MTMS point is greater than is considered mobility (walking upstairs), while the contrary is referred to as non-mobility (standing). In the following section, we apply a technique known as the logical precedent to the martingale sequence to improve the algorithm's effectiveness in bout detection.
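The decision rule can be sketched as follows. The MAD-based threshold below uses an illustrative multiplier k; the constant actually used in this work follows from the expression above and is not reproduced here.

```python
import numpy as np

def mad_threshold(window, k=3.0):
    """Median absolute deviation threshold over a window of MTMS points.

    Returns median(window) + k * MAD(window); k is an illustrative multiplier.
    """
    window = np.asarray(window, dtype=float)
    med = np.median(window)
    mad = np.median(np.abs(window - med))
    return med + k * mad

def label_mobility(mtms_points, window):
    """Label each MTMS point as mobility (True, GB) or non-mobility (False, non-GB)."""
    h = mad_threshold(window)
    return [point > h for point in mtms_points]
```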

3.3.4. Logical Precedent

To improve the precision of the GB discovery, we apply two basic rules [40] as follows:
(i) Non-mobility intervals that are shorter than and lie between two mobility periods are translated into mobility.
(ii) Mobility intervals of less than are translated into non-mobility periods.

The neuro-psychological reason for the two rules is that GBs shorter than 3 seconds are presumed not to be genuine GBs. In contrast, a non-mobility interval of fewer than 3 seconds connecting two mobility intervals can be regarded as a brief, inconsequential rest [40]. Consequently, any gait bout of and above is considered a real walking bout [82]. This understanding enables us to isolate unwanted GB detections due to noise or spurious readings captured by the sensors in the HAR time series. The following section explains the optimization technique used to improve the algorithm's parameters for enhanced performance.
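A minimal sketch of the two precedence rules is shown below, operating on the boolean mobility labels produced by the threshold step. The conversion of the 3-second limit into a number of samples depends on the sampling or windowing rate, so min_samples is an assumption to be set accordingly.

```python
def _runs(labels):
    """Return (start, end, value) runs of consecutive equal labels."""
    runs, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            runs.append((start, i, labels[start]))
            start = i
    return runs

def apply_precedence_rules(labels, min_samples=150):
    """Apply the two logical precedent rules to a boolean mobility sequence.

    min_samples is the number of samples spanning the minimum bout duration
    (e.g. 3 s at 50 Hz); it is an assumed conversion, not a value from the paper.
    """
    labels = list(labels)
    # Rule (i): short non-mobility gaps between two mobility periods become mobility.
    runs = _runs(labels)
    for idx, (s, e, val) in enumerate(runs):
        if (not val and e - s < min_samples
                and 0 < idx < len(runs) - 1 and runs[idx - 1][2] and runs[idx + 1][2]):
            labels[s:e] = [True] * (e - s)
    # Rule (ii): mobility intervals shorter than the minimum duration become non-mobility.
    for s, e, val in _runs(labels):
        if val and e - s < min_samples:
            labels[s:e] = [False] * (e - s)
    return labels
```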

3.3.5. Particle Swarm Optimization

Particle swarm optimization (PSO) is a robust stochastic optimization method that uses few parameters to solve complex problems [83, 84]. PSO combines features of both GA and evolutionary methods. This combination makes the PSO technique computationally inexpensive in memory consumption and speed. In addition, PSO can manage continuous optimization problems, where each member of the population is assigned an arbitrary velocity that pushes it through the solution hyperspace. The aim of PSO is to locate the optimum that either maximizes the fitness function or minimizes the loss [36, 85]. PSO involves the following steps [85, 86]:
(i) For every iteration, measure the fitness value of each particle.
(ii) If the fitness value of the particle is better than its Pbest (the personal best location discovered by that particle), then the Pbest is updated with the particle's location, as it is the current best achievable result at that point in time.
(iii) Identify the particle with the best fitness value among all particles as the Gbest (the best global position found by the swarm).
(iv) After discovering the Pbest and Gbest values, each particle's velocity and position are updated as follows:
$$v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \left(Pbest_i - x_i^{t}\right) + c_2 r_2 \left(Gbest - x_i^{t}\right), \qquad x_i^{t+1} = x_i^{t} + v_i^{t+1},$$
where $v_i^{t}$ is the velocity of particle $i$ at iteration $t$ (in our case, velocity is the mechanism used to shift the particles (martingale points) in the search for optimal solutions [87]), $x_i^{t}$ is the position of the particle, $Pbest_i$ is the personal best of the particle, $r_1$ and $r_2$ are random numbers, and $c_1$ and $c_2$ are the acceleration coefficients that determine the amount of influence on the particle's velocity in the direction of the global and local optima [88]. $w$ is the inertia weight, which determines the contribution of the particle's preceding velocity to its velocity at the present time point [89]. In the next section, we implement the PSO using the G-mean metric.

We use the PSO approach to identify the optimal parameters using the G-mean metric. PSO utilizes a fitness function (F) to acquire the optimal parameter values of a method. In this case, the fitness function is the maximum G-mean value computed over a specified range of parameters. The fitness function is given as:
$$F = \max(\text{G-mean}).$$

It is represented as follows:
$$\text{G-mean} = \sqrt{\text{sensitivity} \times \text{specificity}},$$
where sensitivity and specificity range from 0 to 1 for each activity [29]. As discussed earlier, PSO uses equation (11), the fitness function, to locate the maximum G-mean by exploring within the parameter interval values [29]. The PSO parameters chosen to maximize the fitness function can be seen in Table 2.
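The sketch below shows a generic PSO loop maximising a fitness function over a box of parameter values (for example, the TES smoothing factors). The swarm size, the coefficients w, c1, c2, and the iteration count are illustrative defaults rather than the settings listed in Table 2, and the fitness callback (run_detector_and_score in the usage comment) is an assumed hook that runs the detector and returns its G-mean.

```python
import numpy as np

def g_mean(sensitivity, specificity):
    """Geometric mean of sensitivity and specificity."""
    return float(np.sqrt(sensitivity * specificity))

def pso_maximize(fitness, bounds, n_particles=20, n_iters=50,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO maximizing `fitness` over the box `bounds` (shape (dim, 2))."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()
    gbest_fit = pbest_fit.max()
    for _ in range(n_iters):
        r1 = rng.uniform(size=(n_particles, dim))
        r2 = rng.uniform(size=(n_particles, dim))
        # Velocity and position updates as in the equations above.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        if pbest_fit.max() > gbest_fit:
            gbest = pbest[np.argmax(pbest_fit)].copy()
            gbest_fit = pbest_fit.max()
    return gbest, gbest_fit

# Example usage: optimise two hypothetical smoothing factors in [0, 1] x [0, 1].
# best_params, best_gmean = pso_maximize(lambda p: run_detector_and_score(p),
#                                        bounds=[(0.0, 1.0), (0.0, 1.0)])
```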

The new optimized sequence is known as MTMS(PSO) points. The MTMS method is illustrated in Algorithm 1, which shows the step-by-step implementation of the proposed approach.

Data: Input (F): HAR multivariate data set
Result: Output: MTMS points
1: Initialize: ; ;
2: Set values for the cluster group , the value , and the period of historical data (at least 2L periods) for training the TES model;
3: while do
4:  A new example of is discovered;
5:  if = { null } then
6:   Set the strangeness of
7:  else
8:   Compute the strangeness of and the data points in
9:   Compute of ;
10:   Compute using (3);
11:   Translate the dimensions into a single metric and compute the points;
12:   Compute points;
13:   Compute MTMS points
14:   Compute threshold using MTMS points;
15:  end
16:  if then
17:   GB discovered
18:  else
19:   Add into ;
20:  end
21:  if then;
22:  end
23: end
3.4. Baseline Methods

In this section, we elaborate on the previous heuristic thresholding methods (the multivariate randomized power martingale and the multivariate geometric moving average martingale) [50, 71]. These methods have been adapted for GB discovery. We selected these baselines because they implement the martingale framework to detect transitions in the data set. These methods can be compared with our proposed MTMS technique to evaluate the algorithm's performance within the martingale family.

3.4.1. Multivariate Randomized Power Martingale

As discussed previously, we adapt the RPM method to discover GB in a multivariate HAR data set. The following steps are taken to compute the MRPM(PSO) approach:
(i) Compute the RPM points for the vectors to give a new multivariate sequence.
(ii) Translate the newly computed multivariate sequence into a single metric.
(iii) Compute the threshold of the series.
(iv) Optimize the parameters of the algorithm using PSO.

3.4.2. Multivariate Geometric Moving Average Martingale (MGM)

Apart from MRPM(PSO), we consider another technique for GB detection called the geometric moving average martingale (GMAM) [71]. The GMAM method makes use of the SF described in Section 3.3. We adapt this method to analyze multivariate time series, namely, MGM. The MGM algorithm steps are shown below:
(i) Compute the GMAM points of the multivariate HAR data set.
(ii) Translate the vectors into a single metric by finding the average of the GMAM vectors.
(iii) Compute the threshold of the series.
(iv) Optimize the parameters using the PSO technique.

3.5. Evaluation Metrics

This section explains how we measure the performance of our method using popular evaluation metrics [90]. These metrics are computed from a confusion matrix (CM) [91]. The CM depicts the predicted class of the activity (GB and non-GB). Subsequently, the CM is used to determine the accuracy, precision, recall, harmonic mean , and G-mean for GB detection [40]. The accuracy, recall, precision, specificity, sensitivity, and G-mean metrics are used to assess the optimal configuration of the MTMS approaches. In this case, the confusion matrix (CM) evaluates the performance of the algorithm [91].

Accuracy [92] is an intuitive performance metric defined as the ratio of GB(s) correctly detected in HAR to the total observations. Accuracy is given by
$$\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}.$$

We define true negatives (TN) as false GB(s) correctly identified as false. True positives (TP) are actual GB(s) that are correctly identified, while false positives (FP) are incorrect GB(s) identified as true. False negatives (FN) are actual GB(s) identified as incorrect. Therefore, precision, recall (also known as sensitivity), F1 score, and specificity are computed as follows:
$$\text{Precision} = \frac{TP}{TP + FP}, \qquad \text{Recall} = \frac{TP}{TP + FN},$$
$$F1 = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}}, \qquad \text{Specificity} = \frac{TN}{TN + FP}.$$

G-mean [38, 91] measures the overall classification efficiency across the activities by combining the recall and the specificity. A low G-mean denotes poor performance in classifying the positive cases, irrespective of whether the negative cases are precisely classified. The G-mean metric is vital to prevent overfitting the negative class and underfitting the positive class.

G-mean is defined as:
$$\text{G-mean} = \sqrt{\text{Recall} \times \text{Specificity}}.$$

A high G-mean signifies better performance on the positive cases, while a low G-mean denotes low performance [38]. These performance metrics properly evaluate the proposed approaches in locating GB(s), mainly on imbalanced HAR data sets. The performance of our methods is measured using accuracy, precision, recall, harmonic mean , and G-mean [32, 93].
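For completeness, the metrics above can be computed directly from the confusion matrix counts, as in the short helper below (our own illustrative code, not the paper's evaluation script).

```python
def evaluation_metrics(tp, tn, fp, fn):
    """Compute the standard detection metrics from confusion matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)               # sensitivity
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    g_mean = (recall * specificity) ** 0.5
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1, "g_mean": g_mean}
```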

3.6. Statistical Analysis of GB Detection

Our proposed method can estimate the GB(s) in the HAR data set. The steps for identifying the duration of each GB in the sequences are outlined below:
(i) We first detect the number of bouts using our suggested algorithm.
(ii) We then estimate the GB duration in the HAR data set.

The estimate of a GB is obtained by computing the time (in seconds) over which it occurs. As explained earlier, the GB duration is the length of time in seconds for which the GB takes place. The GB duration is given as:

We also use different statistical techniques (Spearman's rank correlation and mean square error) to evaluate the total GB(s) detected. Spearman's rank-order correlation (SROC) [94, 95] is the non-parametric version of the Pearson product-moment correlation. SROC measures the strength and direction of the relationship between two ranked variables. We use SROC to estimate the correlation between the actual GB discovered and the predicted GB.

Given two sequences $X$ and $Y$, we first convert both sequences into ranks and then compute, for every observation $i$, the difference $d_i$ between the two ranks. $d_i$ is given as:
$$d_i = \operatorname{rank}(x_i) - \operatorname{rank}(y_i).$$

Subsequently, SROC is given as:
$$\rho = 1 - \frac{6 \sum_{i=1}^{n} d_i^{2}}{n\left(n^{2} - 1\right)},$$
where $n$ is the number of observations and $\rho$ is the Spearman rank correlation coefficient. In our case, the actual GB(s) are represented as $X$ while the predicted GB(s) are represented as $Y$.

The mean square error (MSE) can evaluate the total GB(s) identified in the HAR sequence [96]. MSE is the mean of the squared differences between the actual and estimated GB(s). MSE is given as:
$$\text{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^{2},$$
where $y_i$ and $\hat{y}_i$ denote the actual and estimated GB values, respectively.
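A short sketch of both agreement measures, using SciPy's Spearman implementation, is given below; the array names are placeholders for the actual and predicted bout durations in seconds.

```python
import numpy as np
from scipy.stats import spearmanr

def bout_agreement(actual_durations, predicted_durations):
    """Spearman rank correlation and mean square error between actual and
    predicted gait-bout durations (in seconds)."""
    actual = np.asarray(actual_durations, dtype=float)
    predicted = np.asarray(predicted_durations, dtype=float)
    rho, _ = spearmanr(actual, predicted)
    mse = float(np.mean((actual - predicted) ** 2))
    return rho, mse
```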

As explained earlier, the SROC value shows how two variables are correlated, while the MSE is a loss function used to estimate how accurate an algorithm is. The following section discusses the experimental results of our suggested method and the previous martingale approaches.

4. Results and Discussion

This section discusses the proposed techniques used to identify GB in HAR. The section also describes the pre-processing methods and results obtained from experimentation using our proposed method and other baseline methods.

4.1. Cross-Validation Procedure

A cross-validation technique is implemented on two different time series called ds1 and ds2. Firstly, we run our method (MTMS(PSO)) on the training data set (ds1) to obtain the optimal parameter values that maximize the G-mean. These parameter values are subsequently used to analyze the test data set (ds2). ds1 and ds2 consist of and data points, respectively. When we run the algorithm on the training set, the optimal parameter values that maximize the G-mean are and . The MTMS(PSO) optimization process for the training set is illustrated in Figure 5(a). We then use these optimal parameter values on the test set (ds2). The confusion matrix (CM) for the MTMS(PSO) performance on the test data set is shown in Table 3. The same process is repeated for the MRPM(PSO) (see Figure 5(b)) and MGM(PSO) algorithms, first on the training set and then on the test data. The MTMS(PSO) output on the training set is illustrated in Figure 6(a), while the corresponding MTMS(PSO) output on the test data set is shown in Figure 6(b). The CMs for the baseline approaches are presented in Tables 4 and 5, respectively.

4.2. Confusion Matrix Analysis

To evaluate our algorithm's performance, note that the columns of the CMs (Tables 3-5) show the reference non-GB and GB, while the rows give the predicted non-GB and GB. We observe from Table 3 that of GB (walking upstairs) are discovered by the MTMS(PSO) approach. Also, the algorithm can identify of non-GB (standing). However, the algorithm only identifies non-GB as GB and GB as non-GB. Additionally, Table 4 illustrates that the MGM(PSO) technique is able to detect of GB and of non-locomotion, respectively. Nevertheless, the algorithm discovers of non-GB that are GB and of GB that are non-locomotion, respectively. Lastly, Table 5 shows that the MRPM(PSO) approach can detect of GB and of non-GB, respectively. Besides, the algorithm discovers of non-GB that are actual movement and of GB that are truly non-GB.

To summarize the analysis above, we can conclude that the MTMS(PSO) approach is more efficient, as its percentage detection of GB is slightly higher (over ) compared to MGM(PSO) and MRPM(PSO), respectively. In addition, the false detection rate of the MTMS(PSO) method is lower than that of the baseline approaches.

The CMs outlined above can be used to compute the evaluation metrics, which are summarized in Table 6, showing the median and interquartile range (the interquartile range is the spread between the lower and upper quartiles) of the outcomes (accuracy, sensitivity, specificity, precision, F1, and G-mean) for both the training and test sets.

4.3. Evaluation Metrics of GB Detection

Table 6 shows the evaluation metrics of our proposed and previous approaches, respectively. The accuracy, sensitivity, specificity, precision, F1 score, and G-mean for both the training and test data sets are determined by finding the median of both results and the percentage value. The results in Table 6 are further summarized in Table 7, which shows the performance improvement (in percentage) of the proposed approach compared to the MGM(PSO) and MRPM(PSO) methods, respectively. When we compare our suggested approach to the MGM(PSO) method, we observe that our method gives a better accuracy metric (over ) than the MGM(PSO) approach. Subsequently, our approach offers better sensitivity (over ) and specificity (over ) metrics compared to the MGM(PSO) technique. Also, our proposed approach gives preferable precision (over ) compared to the MGM(PSO) method. Consequently, our method produces improved F1 (over ) and G-mean (over ) measures compared to the MGM(PSO) technique.

We then compare our suggested approach to the conventional MRPM(PSO) method and observe that our technique gives a better accuracy metric (over ). In addition, our approach produces preferable sensitivity (over ) and specificity (over ) metrics compared to the previous MRPM(PSO) technique. Also, our approach gives better precision (over ) than the MRPM(PSO) method. In addition, our method produces improved F1 (over ) and G-mean (over ) measures compared to the traditional martingale technique.

We also compare our work to that of Patil et al. [41], who proposed a method (ELM) for multi-class gait classification. Table 8 presents the classification accuracy for classifying GB(s).

Generally, the ELM technique achieved a classification accuracy of in detecting GB(s), whereas our method achieved an overall accuracy of in detecting GB in HAR data obtained from smartphones. The advantages of our method over the ELM technique are as follows:
(i) Our model is a practical sequential GB detection approach that can process new HAR data points as they arrive, unlike the ELM approach.
(ii) The ELM can only analyze labelled data sets for gait pattern detection, whereas our algorithm can analyze unlabelled HAR data sets for GB discovery without supervision.
(iii) Our proposed method can adapt to changes in data patterns.

Overall, we can conclude that our method produces a superior outcome compared to the ELM, MGM(PSO), and MRPM(PSO) approaches, respectively.

4.4. GB Detection Analysis

This section discusses the metric performance results for the evaluation of our proposed algorithm. The two metrics considered are the Spearman test and the mean square error (MSE). The Spearman correlation between the GB values estimated by our algorithm and the reference values is , the correlation for the MGM(PSO) technique is , and the correlation for the MRPM(PSO) method is . From this test, we can ascertain that the proposed MTMS(PSO) method gives a higher correlation than the baseline methods. The Spearman test is illustrated in Figures 7(a)–7(c). Furthermore, the MSE between the estimated GB values and the reference points for MTMS(PSO), MGM(PSO), and MRPM(PSO) is , , and , respectively. The results show that our proposed MTMS(PSO) is more efficient (it produces a lower error rate) than MGM(PSO) and MRPM(PSO), respectively. The low MSE is attributed to the proposed approach suppressing the noise that leads to false positives more effectively than the baseline methods. These results (Spearman's rank-order correlation (SROC) and MSE) are outlined in Table 9.
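A minimal sketch of how the two agreement measures in Table 9 can be computed is given below; the reference and estimated arrays are illustrative values, not the study's data.

import numpy as np
from scipy.stats import spearmanr

reference = np.array([12.0, 8.5, 20.0, 5.0, 15.5, 9.0])   # illustrative reference GB values
estimated = np.array([11.5, 9.0, 19.0, 5.5, 14.5, 9.5])   # illustrative detector estimates

rho, p_value = spearmanr(reference, estimated)   # Spearman's rank-order correlation
mse = np.mean((reference - estimated) ** 2)      # mean square error

print(f"Spearman rho = {rho:.3f} (p = {p_value:.3g}), MSE = {mse:.3f}")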

4.5. Computation Time

We also performed six runs of each algorithm (MRPM(PSO), MGM(PSO), and MTMS(PSO)) and present the outcomes in Table 10. As can be observed in Table 10, the per-run time of MRPM(PSO) is lower than that of MGM(PSO) and MTMS(PSO), respectively. This is because MGM(PSO) and MTMS(PSO) are extensions that adapt the original MRPM martingale method. The results show that the computation time of the MTMS(PSO) method is only slightly higher (by a fraction of a second) than that of the baseline methods (MRPM(PSO) and MGM(PSO)). However, we intend to further validate the proposed algorithm's computational time by experimenting with other HAR data sets.
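The timing procedure can be sketched as follows; the callables standing in for MRPM(PSO), MGM(PSO), and MTMS(PSO) are dummy workloads used purely to illustrate how per-run wall-clock times might be collected.

import time
import statistics

def dummy_workload(scale):
    # Placeholder computation standing in for one full detection run.
    return sum(i * i for i in range(scale))

algorithms = {
    "MRPM(PSO)": lambda: dummy_workload(200_000),
    "MGM(PSO)":  lambda: dummy_workload(300_000),
    "MTMS(PSO)": lambda: dummy_workload(350_000),
}

runs = 6
for name, run in algorithms.items():
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        run()
        times.append(time.perf_counter() - start)
    print(f"{name}: median {statistics.median(times):.3f}s over {runs} runs")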

From the experimental analysis, we can conclude that our proposed MTMS(PSO) method detects GB better than the MRPM(PSO) and MGM(PSO) approaches. The following section provides further discussion of the results.

5. Discussion

This paper developed an effective algorithm to identify GB(s) and estimate their durations using a smartphone attached to the waist in unsupervised, real-world situations. The goal of this study is to provide a reliable, cost-effective, and efficient method that can detect GB(s) in HAR. Two accelerometer data sets (ds1 and ds2) were utilized to validate the effectiveness of our proposed algorithm, which detects most of the GB(s) in the data. Our recommended algorithm achieves a high overall accuracy and performs well from several perspectives. For instance, GB(s) were detected with a sensitivity of . The algorithm's detection of GB(s) in HAR proved sufficient for walking categorization, with very few misclassifications between standing and walking upstairs: the misclassification rate for standing is , while that for walking is . The precision (positive predictive value) of our proposed approach is , while the F1 and G-mean values are and , respectively (see Table 6). These results confirm that our suggested approach is computationally efficient and provides reliable information about GB(s) in HAR, making it suitable for a cost-effective, real-time, computer-aided GB(s) discovery system. Such a system would enable clinicians to assess the movement of the elderly or of persons with movement abnormalities. In Section 1, we presented some such systems and briefly discussed how the study of GB(s) can help address the associated challenges. The following subsections elaborate on how this real-time feedback system can help address gait impairment and link the results outlined above to these challenges.

5.1. Cost Effectiveness in Terms of Applicability in Discovering GB(s)

Our recommended healthcare system incorporates smartphone devices (for collecting human activity recognition data), a pre-processing procedure (for processing the collected time series), and the proposed algorithm (for analyzing the time-varying data stream for GB detection). Such innovative healthcare monitoring systems are noninvasive, low-power, and low-cost, and will assist clinicians or caretakers in providing long-term remote monitoring, carrying out early diagnosis, tracking abnormal movement symptoms over time, and categorizing or predicting pathological states [97].

5.2. Early Diagnosis of Gait Abnormalities

Our proposed algorithm could be used to identify early signs of movement abnormalities, especially in the elderly, by monitoring and studying their GB. The following outlines how gait abnormalities in an (elderly) person could be addressed or managed in a real-time feedback system or application using our suggested algorithm:
(i) Healthcare researchers could track the movement of the elderly in real time by monitoring and observing their daily activities using our proposed real-time health system [2, 98, 99].
(ii) Our suggested system could estimate the GB and non-GB activities performed by an individual [100, 101].

This feedback from our proposed healthcare system could empower medical practitioners to measure the daily gait movement of the elderly [99, 102]. In addition, an unusual observation in the GB study outcome could prompt healthcare professionals to investigate the root cause of any concerns relating to gait disparity [103, 104]. Such concerns might lead to further testing (of muscle strength, muscle tone, and coordination) and to the implementation of a medical intervention to resolve the gait issue [105, 106].

5.3. The Effect of a Free-Living Environment on People with Movement Disabilities

Our proposed healthcare solution could enable the practical monitoring of the elderly or of persons with gait disorders arising from neurological or non-neurological causes [104, 107, 108]. These conditions can lead to poor coordination, unsteadiness, and a staggering gait in the older generation [109]. Our proposed healthcare system could assess the gait movement of these individuals in their homes and surroundings through sensor devices, as explained earlier, and could also help identify gait disorders. Such healthcare systems could discover and analyze the GB of an older person to assess their movement and coordination daily [104], enabling healthcare workers to adjust treatment and exercise levels based on these findings.

5.4. Tracking of Parkinson’s Disease Progression and Risk Prevention Measure

Gait can be described as a person's pattern of walking. Gait disorders associated with Parkinson's disease (PD) exhibit decrements in acceleration and stride, such as walking with shortened steps and low acceleration. Previously, the unified Parkinson's disease rating scale (UPDRS) motor score was used to study the motor symptoms of PD sufferers [110]. However, this score does not capture improvement or deterioration in gait and posture. Therefore, there is a need to analyze the gait activity of PD patients using our suggested system to assess the degree to which PD affects their movement and posture. In addition, this gait assessment could assist healthcare professionals in enhancing current practices for symptom monitoring, rehabilitation, therapy strategies, risk assessment, and prevention. The following section discusses the conclusion and future work.

6. Conclusion and Future Work

This paper briefly discusses the RPM approach, which can discover abnormalities in time series, and proposes a feasible and efficient method in terms of cost and applicability. The approach involves using a smartphone to capture the pattern of human activity scenarios such as walking upstairs and standing. The smartphone is attached to the waist of 30 subjects aged from 19 to 48 years, and its embedded accelerometer picks up movement through its sensitivity to triaxial orientation. We then implement an unsupervised learning technique known as MTMS(PSO) in the experimental setup. The suggested method reduces high-frequency noise by applying a smoothing procedure (TEMA) to the processed accelerometer data, and further uses logical precedence and an optimization technique to improve accuracy and precision. Unlike some traditional methods, it does not rely on a window size to measure the accuracy and precision rates.
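As an illustration of the smoothing step, the sketch below applies a triple exponential moving average to a noisy signal, following the common TEMA definition (3*EMA1 - 3*EMA2 + EMA3); the exact formulation and smoothing factor used by the proposed method may differ, so this is an assumption-laden example only.

import numpy as np

def ema(x, alpha):
    # Single exponential moving average with smoothing factor alpha.
    out = np.empty_like(x, dtype=float)
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = alpha * x[t] + (1 - alpha) * out[t - 1]
    return out

def tema(x, alpha=0.2):
    # Common TEMA construction: combine three nested EMAs to reduce lag
    # while still suppressing high-frequency noise.
    e1 = ema(np.asarray(x, dtype=float), alpha)
    e2 = ema(e1, alpha)
    e3 = ema(e2, alpha)
    return 3 * e1 - 3 * e2 + e3

# Example: a noisy accelerometer-like signal.
rng = np.random.default_rng(1)
raw = np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 0.3, 300)
smoothed = tema(raw, alpha=0.2)
print("raw std:", round(float(raw.std()), 3), "smoothed std:", round(float(smoothed.std()), 3))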

The proposed approach can identify GB(s) in HAR data. The proposed MTMS(PSO) gives a slightly higher G-mean value (over ) than MRPM(PSO). Our suggested algorithm was also compared with the previous MGM(PSO) approach; experimentation shows that our method produces better results (over 1.0%) in accuracy, specificity, precision, F1, and G-mean, respectively. In addition, our method outperforms the ELM approach.

Our recommended method can study and measure gait irregularities or patterns in a person with movement disabilities. It can also be used to test how effectively preventive remedies improve gait disorders, especially in the elderly. Additionally, the gait pattern analysis provided by our system could support effective clinical decisions to assist or monitor such pathological behaviour.

Further study will include the following:
(i) To validate the approach using a wide range of HAR data streams to characterize GB in the movement of people with disabilities, especially the elderly.
(ii) To investigate the nature of GB associated with certain gait disabilities and how they can be distinguished in HAR data sets.
(iii) To examine the correlation between the extent of transition and gait abnormalities using the MTMS(PSO) value.
(iv) To use our proposed healthcare system to assist people with gait abnormalities who demonstrate uncommon movements involving the exchange of upper- and lower-limb swinging in time and other joint dynamics.
(v) To implement and validate the algorithm's performance on time series from other domains for anomaly or change detection.
(vi) To further improve the algorithm, making it more computationally efficient.

Data Availability

The HAR data is publicly available from the UCI Machine Learning Repository through the research work of Anguita et al. (Davide Anguita, Alessandro Ghio, Luca Oneto, Xavier Parra, and Jorge Luis Reyes-Ortiz, "Energy-efficient smartphone-based activity recognition using fixed-point arithmetic," J. UCS, 19(9): 1295–1314, 2013).

Disclosure

The funders had no role in the study’s design, in the collection, analyses, or interpretation of data, in the writing of the manuscript, or in the decision to publish the results.

Conflicts of Interest

The authors declare no conflict of interest.

Acknowledgments

This work was supported by a University of Ulster Vice-Chancellor’s Research Studentship. The authors would like to thank anonymous reviewers for their constructive suggestions.