BioMed Research International

Volume 2016, Article ID 1675785, 11 pages

http://dx.doi.org/10.1155/2016/1675785

## The Performance of Short-Term Heart Rate Variability in the Detection of Congestive Heart Failure

^{1}Universidade CEUMA, No. 100, 65903-093 Imperatriz, MA, Brazil

^{2}Laboratory for Biological Information Processing, Universidade Federal do Maranhão, S/N, São Luís, MA, Brazil

^{3}Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya-shi 464-8603, Japan

Received 2 March 2016; Revised 13 June 2016; Accepted 26 July 2016

Academic Editor: Said Audi

Copyright © 2016 Fausto Lucena et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Congestive heart failure (CHF) is a cardiac disease associated with a decreasing capacity of the cardiac output. CHF has been shown to be a leading cause of cardiac death around the world. Previous works have proposed to discriminate CHF subjects from healthy subjects using either the electrocardiogram (ECG) or heart rate variability (HRV) from long-term recordings. In this work, we propose an alternative framework that discriminates CHF subjects from healthy subjects using short-term HRV intervals of 256 continuous RR samples. Our framework uses a matching pursuit algorithm based on Gabor functions. From the selected Gabor functions, we derive a set of features that are fed into a hybrid framework, which uses a genetic algorithm and a k-nearest neighbour classifier to select the feature subset with the best classification performance. The performance of the framework is analyzed using both the Fantasia and the CHF databases from the Physionet archives, which are composed of 40 healthy volunteers and 29 CHF subjects, respectively. From a set of 16 nonstandard features, the proposed framework reaches an overall accuracy of 100% with five features. Our results suggest that hybrid frameworks whose feature selection is driven by genetic algorithms outperform well-known classifier methods.

#### 1. Introduction

Every year, congestive heart failure (CHF) related diseases are responsible for the death of millions of people around the world [1–3]. In this regard, large efforts are devoted to prolonging the life of affected subjects [4]. Moreover, the treatment of cardiac pathologies is ranked among those with the highest cost for the healthcare system in low- and middle-income countries [1, 5]. Thus, governments are encouraging the development of simple, low-cost methods able to detect heart failure in preventive exams. Such an accomplishment would represent a breakthrough in the fight against life-threatening diseases [6].

At the clinical level, conventional methods to diagnose heart failure are based on a combination of tests (i.e., Valsalva maneuver, electrocardiography, echocardiography, and chest radiograph) and clinical history to determine whether or not the patient is afflicted with heart failure [7]. Among the criteria used (i.e., Framingham, Duke, and Boston), the Boston criteria achieve a sensitivity of 50% and a specificity of 78%. Electrocardiography methods, through the analysis of abnormal ECGs, reach a sensitivity of 81.14% and a specificity of 51.01% [8]. Echocardiograms show suboptimal values, between 5% and 10% at rest and around 20% under stress [9]. As one can see, the current problem of conventional diagnostic methods is the considerable difference between the percentages of correct and incorrect initial diagnoses [10]. A direct consequence is that false positives lead to unnecessary tests, whereas false negatives delay the diagnosis. Diagnostic reliability, however, might be increased if the screening test for heart failure could be assisted by signal processing and biomedical analysis techniques. In the past years, several works [11–15] have shown the possibility of classifying subjects with heart failure. For instance, Işler and Kuntalp (2007), using short-term heart rate variability (HRV) intervals, showed that normalizing classical HRV and entropy measures can lead to high levels of sensitivity (82.76%) and specificity (100%). Kampouraki et al. (2009) suggested that the classification accuracy of heartbeat time series can be greatly improved, even reaching maximum accuracy, if support vector machines (SVM) are used. A joint wavelet and SVM approach, for example, yields one of the highest success rates (98.61%) in classifying CHF against normal sinus rhythm (NSR) [14].
Thuraisingham (2009), using a second-order difference plot of RR intervals, reported the best success rate (100%), but at the cost of long-term RR intervals (24 hours). There is also a wide range of studies that use multiscale entropy (MSE) as a fundamental parameter of discriminative power [16]. As an example, a recent work proposed the use of reduced-data dual-scale metrics, whose accuracy reached 100% using 500 RR samples (10 minutes of ECG recordings) [17]. Yet, measures based on MSE are heavily biased by the number of samples, scales, and block analysis. A method based on classification and regression has shown a promising use of short-term intervals: it demonstrates that sensitivity and specificity can reach 89.7% and 100%, respectively, by taking into account the average variation over 24 hours of consecutive heartbeat intervals [18]. Regardless of the number of sample tests and the methodology used, the proposed techniques have different degrees of complexity. Specifically, they emphasize uncovering patterns that could be used to predict sudden death caused by heart failure. One interesting view of this problem is to find a representation that could be considered the representative pattern subserving the genesis of the autonomic cardiac control. In [19], for example, the authors show that it is possible to segregate cardiopathies by the scaling behavior of heartbeat intervals using wavelets.
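The sensitivity and specificity figures quoted throughout this comparison are simple ratios over a confusion matrix. A minimal sketch, with illustrative counts and hypothetical helper names (not from the paper), makes the definitions concrete:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: fraction of diseased subjects correctly flagged."""
    return tp / (tp + fn)


def specificity(tn: int, fp: int) -> float:
    """True-negative rate: fraction of healthy subjects correctly cleared."""
    return tn / (tn + fp)


# Illustrative example: 50 CHF patients (41 detected) and
# 50 healthy controls (39 correctly cleared).
print(f"sensitivity = {sensitivity(41, 9):.2%}")   # 82.00%
print(f"specificity = {specificity(39, 11):.2%}")  # 78.00%
```

A screening test with high sensitivity minimizes late diagnoses (few false negatives), while high specificity minimizes unnecessary follow-up tests (few false positives); the literature above trades these off in different ways.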

Choosing which structures should be discarded or retained during the analysis of ECG signals is a standard problem in clinical diagnosis. In this regard, one should comprehend the nature of the signal to infer the relevance of the structures composing its pattern. A common strategy to solve this problem has been to find patterns that are likely to appear in the presence of clinical alterations in the subjects under observation. Usually, a specialist needs to spend considerable time and effort analyzing the data. Herein, we propose an alternative solution: a method that can help to predict congestive heart failure based on the analysis of short-term RR intervals (5 minutes of ECG). With recent advances in computer-aided detection and diagnosis systems, the need for simple and accurate methods plays an important role, especially in telemedicine. The method described here shows the capacity to indicate the presence or absence of a cardiac disease. Moreover, our methodology can be extended to other areas, such as the detection of breast cancer [20] and diabetes [21], and even to distinguishing different modalities of motor imagery based on EEG analysis [22]. Last but not least, our idea is also to reach patients in remote areas, that is, where one does not have easy access to diagnostic tools. For instance, there are areas where only an ECG is available and usually no specialist, but a general clinician (a problem that we currently see in some of the poorer regions of Brazil) [6].

This paper is organized as follows. Section 2 covers the matching pursuit algorithm. Section 3 describes the database used. Sections 4 and 5, respectively, explain the feature extraction and the feature subset selection. An overview of the system is given in Section 6. Finally, discussion, results, and conclusions can be found in Sections 7 to 9.

#### 2. The Matching Pursuit Algorithm

Several models of autonomic cardiac regulation are based either on the analysis of input-output relationships [23–25] or on the idea of selective frequency extraction [26]. Altogether, they often explore the standard frequency division suggested for analyzing HRV signals [27]. A simple way to accomplish this task is to use the Fourier transform or autoregressive (AR) methods. A drawback, however, is that Fourier and AR methods are not robust to nonstationarity. An alternative has been to use joint time-frequency transformations to overcome nonstationarity. Essentially, one can sidestep the nonstationarity problem by selecting a function that decomposes a signal into a sequence of bases using adaptive time-frequency transform (ATFT) algorithms. This is accomplished by scaling, translating, and modulating versions of the basis function, such that they represent the decomposed signal with a well-defined time and frequency distribution. ATFT algorithms have drawn a lot of attention in pattern classification [28] and signal compression due to their capacity to reduce a higher-dimensional space to a small number of parameters. One of the most used ATFT algorithms exploits a matching pursuit (MP) decomposition [29, 30]. The MP framework represents a signal $x(t)$ as a linear combination of basis functions drawn from an overcomplete dictionary $D = \{g_{\gamma}\}_{\gamma \in \Gamma}$, where $\|g_{\gamma}\| = 1$, or alternatively

$$x(t) = \sum_{n=0}^{N-1} a_n\, g_{\gamma_n}(t) + R^{N}x(t),$$

in which $g_{\gamma}$ can be Gabor functions described as

$$g_{\gamma}(t) = K(\gamma)\, e^{-\pi\left((t-u)/s\right)^{2}} \cos\!\left(2\pi\xi(t-u) + \phi\right),$$

where $a_n$ is the modulatory coefficient, $s$ is the scale, $\xi$ is the frequency modulation, $u$ is the translation, $\phi$ is the phase, and $K(\gamma)$ is a normalization factor such that $\|g_{\gamma}\| = 1$. Based on previous studies [31], we know that the structures underlying the heartbeat interval components have a Gabor-like representation. Using the MP decomposition of heartbeat intervals by Gabor functions, it is possible to capture representations in terms of coherent and noncoherent structures [32].
On the one hand, coherent structures can be understood as the Gabor functions (composing the dictionary) that have the highest correlation with the decomposed interval. On the other hand, noncoherent structures are likely to represent noise-like random structures that are not well defined in terms of time-frequency representation and tend to have small correlation with the decomposed interval.
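Assuming the standard real Gabor parameterization with a Gaussian envelope (the exact normalization constant used by the authors is not reproduced here, so the atom is simply rescaled to unit energy), a discrete Gabor atom with scale, frequency, translation, and phase parameters can be sketched as follows (`gabor_atom` is a hypothetical helper name):

```python
import numpy as np


def gabor_atom(n: int, s: float, xi: float, u: float, phi: float) -> np.ndarray:
    """Discrete real Gabor atom of length n: a Gaussian envelope with
    scale s centred at translation u, modulating a cosine of frequency
    xi (radians/sample) and phase phi.  Rescaled to unit energy so that
    projections onto it are plain inner products."""
    t = np.arange(n)
    g = np.exp(-np.pi * ((t - u) / s) ** 2) * np.cos(xi * (t - u) + phi)
    return g / np.linalg.norm(g)


# Example: a 256-sample atom matching the short-term RR window length.
atom = gabor_atom(256, s=32.0, xi=0.3, u=128.0, phi=0.0)
```

Varying $(s, \xi, u, \phi)$ over a grid yields the overcomplete dictionary $D$ from which MP selects atoms.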

The MP decomposes the signal by iteratively finding the best orthogonal projections among the set of basis functions in the dictionary that match the structure of the signal. The result is a finite number of basis functions organized in decreasing order of energy.

A fundamental aspect of the MP algorithm is how the signal is decomposed [32]. Because not all signals are composed of well-defined (coherent) components, the MP tends to decompose the coherent underlying structures first and then break the random spike-like noise into a set of basis functions whose time-frequency distributions are less compact than those of the coherent ones. Figure 1 illustrates an example of MP decomposition of CHF and NSR HRV waveforms followed by their time-frequency representations. It shows remarkable differences in the time-frequency plane, which are likely to be associated with the temporal variations of the HRV intervals.
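The greedy iteration just described can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the hypothetical `matching_pursuit` helper assumes the dictionary is a matrix whose rows are unit-norm atoms, so each inner product is directly the modulatory coefficient $a_n$:

```python
import numpy as np


def matching_pursuit(x: np.ndarray, dictionary: np.ndarray, n_atoms: int = 10):
    """Greedy MP: at each step pick the unit-norm atom with the largest
    absolute inner product with the current residual, subtract its
    projection, and record (atom index, coefficient).  Coefficients come
    out in decreasing order of captured energy."""
    residual = x.astype(float).copy()
    selected = []
    for _ in range(n_atoms):
        corr = dictionary @ residual          # inner products with all atoms
        k = int(np.argmax(np.abs(corr)))      # best-matching atom
        a = corr[k]                           # modulatory coefficient a_n
        residual -= a * dictionary[k]         # remove the coherent part
        selected.append((k, a))
    return selected, residual


# Sanity check: a signal built from one dictionary atom is recovered exactly.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 64))
D /= np.linalg.norm(D, axis=1, keepdims=True)  # unit-norm atoms as rows
atoms, resid = matching_pursuit(3.0 * D[5], D, n_atoms=1)
```

Coherent structures correspond to the early, large-coefficient picks; what remains in the residual after a few iterations is the noncoherent, noise-like part.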