BioMed Research International

Special Issue: Biometrics and Biosecurity

Research Article | Open Access

Volume 2012 | Article ID 834578 | 9 pages

A Classification Method of Normal and Overweight Females Based on Facial Features for Automated Medical Applications

Academic Editor: Sabah Mohammed
Received: 22 May 2012
Accepted: 30 May 2012
Published: 05 Aug 2012


Obesity and overweight have become serious public health problems worldwide. Obesity and abdominal obesity are associated with type 2 diabetes, cardiovascular diseases, and metabolic syndrome. In this paper, we first suggest a method of predicting normal and overweight status in females according to body mass index (BMI) based on facial features. A total of 688 subjects participated in this study. We obtained an area under the ROC curve (AUC) value of 0.861 and a kappa value of 0.521 in the female: 21–40 (females aged 21–40 years) group, and an AUC value of 0.760 and a kappa value of 0.401 in the female: 41–60 (females aged 41–60 years) group. In both groups, we found many features showing statistically significant differences between normal and overweight subjects using an independent two-sample t-test. We demonstrated that it is possible to predict BMI status from facial characteristics. Our results provide useful information for studies of obesity and facial characteristics, and may offer clues for the development of applications for the alternative diagnosis of obesity in remote healthcare.

1. Introduction

Obesity and overweight have become major health issues because the prevalence of obesity has risen rapidly worldwide. The causes of this phenomenon are excessive food intake, lack of physical activity, and environmental and genetic factors [1, 2]. Obesity and abdominal obesity are potential risk factors for insulin resistance and type 2 diabetes, cardiovascular diseases, stroke, ischemic heart disease, and metabolic syndrome [3–6], and many studies have investigated the relationship between obesity, disease, and body mass index (BMI) [7–13]. In medicine and public health, BMI is commonly used as an indicator of overall adiposity; thus, BMI is essential medical information for the prognostic prediction of disease and for clinical therapy. The principal cutoff points for underweight (<18.50 kg/m2), normal range (18.50–24.99 kg/m2), overweight or preobese (25.00–29.99 kg/m2), and obese (≥30.00 kg/m2) have been set by the World Health Organization (WHO).

A large number of studies on the human face have focused on facial morphology, face recognition, and medicine [14–23]. Facial characteristics provide clinical information on the present or future health conditions of patients. For example, the status of the cheeks, neck circumference, and craniofacial morphology are associated with health complications such as type 2 diabetes, hypertension, and sleep apnea [18]. Using computed tomographic (CT) scanning, Levine et al. [19] showed that the quantity of buccal fat is strongly related to visceral abdominal fat accumulation, based on the observation that patients with chubby facial cheeks tend to have upper-body obesity, and argued that plump cheeks may indicate a high risk of metabolic complications related to obesity. Further, using facial measurements, Sadeghianrizi et al. [20] showed that craniofacial morphology differs significantly between normal and obese adolescents; the facial skeletal structures of obese adolescents tended to be relatively large, and obesity was associated with bimaxillary prognathism.

The motivation for this study is conveyed by the following two questions: which features or facial characteristics are associated with overweight and normal BMI status? And, if we identify facial features that differ between normal and overweight subjects, how accurately can we distinguish normal from overweight using these features? The contributions of this study are as follows. We first propose a method of classifying normal and overweight status using only facial characteristics; to date, no study has addressed the prediction of BMI status from facial features. Furthermore, through statistical analysis we identify meaningful, discriminatory features that show statistically significant differences between normal and overweight subjects, and we identify compact and useful feature sets for BMI classification using facial features in female groups. The results of this study will be useful in understanding the relationship between obesity-related diseases and facial characteristics.

2. Materials and Methods

2.1. Data Collection

A total of 688 subjects participated in this study. At the Korea Institute of Oriental Medicine, frontal and profile photographs of the subjects' faces with a neutral expression were acquired using a digital camera (Nikon D700 with an 85 mm lens) together with a ruler, and the subjects' clinical information, such as name, age, gender, weight, height, blood pressure, and pulse, was recorded. All images were captured at a resolution of 3184 × 2120 pixels in JPEG format. The height and weight of the subjects were measured with a digital scale (GL-150; G Tech International Co., Ltd., Republic of Korea).

Based on identifiable feature points in the front and profile images of subjects, a total of 86 features were extracted. The extracted features included the distance between points n1 and n2 in a frontal (or profile) image, the vertical distance between n1 and n2 in a frontal (or profile) image, the angle formed by three points n1, n2, and n3 in a frontal (or profile) image, the area of the triangle formed by the three points n1, n2, and n3 in a profile image, and so forth. All points in the front and profile images are shown in Figure 1, and all the extracted features with brief descriptions are given in Table 1.
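The basic geometric feature types described above can be computed directly from 2-D landmark coordinates. The following sketch is not the authors' code; the function names are ours, and the formulas are the standard Euclidean-geometry definitions implied by the feature descriptions:

```python
import math

def distance(p1, p2):
    """Euclidean distance between two landmarks (the FDn1_n2 feature type)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def horizontal_distance(p1, p2):
    """Horizontal distance between two landmarks (the FDHn1_n2 feature type)."""
    return abs(p2[0] - p1[0])

def vertical_distance(p1, p2):
    """Vertical distance between two landmarks (the FDVn1_n2 feature type)."""
    return abs(p2[1] - p1[1])

def angle_at(p1, p2, p3):
    """Angle at vertex p2 formed by p1-p2-p3, in degrees (the FAn1_n2_n3 type)."""
    v1 = (p1[0] - p2[0], p1[1] - p2[1])
    v2 = (p3[0] - p2[0], p3[1] - p2[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def triangle_area(p1, p2, p3):
    """Area of the triangle formed by three landmarks (shoelace formula)."""
    return abs((p2[0] - p1[0]) * (p3[1] - p1[1])
               - (p3[0] - p1[0]) * (p2[1] - p1[1])) / 2.0
```

Ratio features such as FR05_psu in Table 1 are then simple quotients of these primitives, for example FDH(33, 133)/FD(43, 143).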

Feature: brief description

FDn1_n2: Distance between points n1 and n2 in a frontal (or profile) image
FDHn1_n2: Horizontal distance between n1 and n2 in an image
FDVn1_n2: Vertical distance between n1 and n2 in an image
FAn1_n2_n3: Angle of three points n1, n2, and n3 in an image
FAn1_n2: Angle between the line through two points n1 and n2 and a horizontal line
FR02_psu: FD(17, 26)/FD(18, 25)
FR03_psu: (FD(18, 25) + FD(118, 125))/FDH(33, 133)
FR05_psu: FDH(33, 133)/FD(43, 143)
FR06_psu: FDH(33, 133)/FDV(52, 50)
FR08_psu: FD(43, 143)/FDV(52, 50)
FArea02: Area of the contour formed by the points 53, 153, 133, 194, 94, 33, and 53
FArea03: Area of the contour formed by the points 94, 194, 143, 43, and 94
Fh_Cur_Max_Distan: Distance between points 7 and 77 in a profile image
Fh_Angle_n1_n2: Angle between the line through two points n1 and n2 and a horizontal line
Nose_Angle_n1_n2: Angle between the line through two points n1 and n2 and a horizontal line
Nose_Angle_n1_n2_n3: Angle of three points n1, n2, and n3 in a frontal (or profile) image
SAn1_n2: Angle between the line through two points n1 and n2 and a horizontal line
Fh_Cur_Max_R79_69: FD(77, 9)/FD(6, 9)
Nose_Area_n1_n2_n3: Area of the triangle formed by three points n1, n2, and n3 in a profile image
EUL_L_el1 ~ EUL_L_el7: Slope of the tangent at a point (el1–el7) in a frontal image
EUL_L_DH: FDH(el1, el7)
EUL_L_MAX: FDH(el1, elmax)
EUL_L_RMAX: FDH(el1, elmax)/FDH(el1, el7)
EUL_L_Sb: FDV(el7, el1)/FDH(el7, el1)
EUL_L_St: FDV(elmax, el7)/FDH(elmax, el7)
EUL_L_Sf: FDV(elmax, el1)/FDH(elmax, el1)
EUL_L_Khmean: Average curvature of the left upper eyelid contour
EUL_L_khmax: Maximum curvature of the left upper eyelid contour
EUL_R_er1 ~ EUL_R_er7: Slope of the tangent at a point (er1–er7) in a frontal image
EUL_R_DH: FDH(er1, er7)
EUL_R_MAX: FDH(er1, ermax)
EUL_R_RMAX: FDH(er1, ermax)/FDH(er1, er7)
EUL_R_Sb: FDV(er7, er1)/FDH(er7, er1)
EUL_R_St: FDV(ermax, er7)/FDH(ermax, er7)
EUL_R_Sf: FDV(ermax, er1)/FDH(ermax, er1)
EUL_R_Khmean: Average curvature of the right upper eyelid contour
EUL_R_khmax: Maximum curvature of the right upper eyelid contour
PDH44_53: Horizontal distance between points 44 and 53 in a profile image
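The paper does not specify how the eyelid curvature features (EUL_L_Khmean, EUL_L_khmax, and their right-side counterparts) are computed from the contour points. One common discrete approximation, assumed here purely for illustration, is the Menger curvature of each consecutive triple of contour points, i.e., the reciprocal of the radius of the circle through the three points:

```python
import math

def menger_curvature(p1, p2, p3):
    """Curvature of the circle through three points: 4*Area / (|p1p2| |p2p3| |p1p3|)."""
    area = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
               - (p3[0] - p1[0]) * (p2[1] - p1[1])) / 2.0
    a, b, c = math.dist(p1, p2), math.dist(p2, p3), math.dist(p1, p3)
    if a * b * c == 0:
        return 0.0  # degenerate triple (coincident points)
    return 4.0 * area / (a * b * c)

def contour_curvature_stats(points):
    """Khmean/khmax-style statistics over consecutive triples of a contour."""
    ks = [menger_curvature(points[i - 1], points[i], points[i + 1])
          for i in range(1, len(points) - 1)]
    return sum(ks) / len(ks), max(ks)

# Sanity check: points sampled on a circle of radius 2 have curvature 1/2.
arc = [(2 * math.cos(t), 2 * math.sin(t)) for t in (0.0, 0.4, 0.8, 1.2, 1.6)]
mean_k, max_k = contour_curvature_stats(arc)
```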

2.2. Normal and Overweight Cutoff Points

BMI was calculated as weight (kg) divided by the square of height (m). The health consequences and BMI ranges of overweight and obesity are open to dispute [10, 24]. This is a natural consequence: physiological and environmental factors associated with race produce differences in BMI values, and the assignment of BMI cutoffs for obesity and overweight depends on various factors, such as ethnic group, national economic status, and rural/urban residence [8]. For instance, BMI values of populations in Asian regions tend to be lower than those of populations in Western regions; however, Asians acquire risk factors for cardiovascular disease and obesity-related diabetes at relatively low BMI values [11, 25]. In this study, we followed the WHO suggestions for cutoff points in the Asia-Pacific region [25]. The proposed categories are as follows: normal, 18.5–22.9 kg/m2; overweight, ≥23 kg/m2.
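The BMI computation and the two-class cutoffs used in this study can be written down directly (a minimal sketch; the function names are ours, and the underweight label is included only for completeness):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by the square of height (m)."""
    return weight_kg / height_m ** 2

def bmi_class_asia_pacific(value):
    """Classes used in this study (WHO Asia-Pacific cutoffs):
    normal 18.5-22.9 kg/m2, overweight >= 23 kg/m2."""
    if value >= 23.0:
        return "overweight"
    if value >= 18.5:
        return "normal"
    return "underweight"

# Example: 60 kg at 1.65 m gives a BMI of about 22.0 -> "normal".
print(bmi_class_asia_pacific(bmi(60, 1.65)))
```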

Since facial features and BMI are influenced by gender and age [26], participants were divided into 2 groups: female: 21–40 (females aged 21–40 years) and female: 41–60 (females aged 41–60 years). Detailed data and basic statistics for each group are presented in Table 2.

Class: Female: 21–40 / Female: 41–60, mean (SD)

Normal: Age 32.1 (5.64) / 50.0 (5.42); BMI 22.2 (2.97) / 23.6 (2.86)
Overweight: Age 32.91 (5.29) / 50.31 (5.44); BMI 26.0 (2.75) / 25.6 (2.31)

For the selection of useful and discriminatory features, only features with P-values < 0.05 in each group by an independent two-sample t-test were used in this study; in other words, only features with a P-value < 0.05 were included in the classification experiments. Consequently, the selected features differ between the two age groups. A detailed analysis of the statistical data and the selected features is presented in Section 3.2.
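This selection step can be sketched as follows, using SciPy's ttest_ind as a stand-in for whatever statistics package was actually used; the data and the noise feature below are synthetic, with means loosely based on Table 5:

```python
import numpy as np
from scipy.stats import ttest_ind

def select_features(normal, overweight, alpha=0.05):
    """Keep features whose independent two-sample t-test P-value is below alpha.
    normal / overweight: dicts mapping feature name -> 1-D array of values."""
    selected = []
    for name in normal:
        _t, p = ttest_ind(normal[name], overweight[name])
        if p < alpha:
            selected.append(name)
    return selected

# Synthetic illustration: one well-separated feature, one pure-noise feature.
rng = np.random.default_rng(0)
normal_cls = {"FD43_143": rng.normal(125.2, 7.1, 200), "noise": rng.normal(0, 1, 200)}
over_cls = {"FD43_143": rng.normal(133.6, 7.4, 200), "noise": rng.normal(0, 1, 200)}
print(select_features(normal_cls, over_cls))
```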

2.3. Preprocessing and Experiment Configurations

In the preprocessing step, the experiment was performed in 2 ways: (1) only normalization (scaling values to the range 0–1) was applied to the raw datasets, and (2) both normalization and discretization were applied for better classification accuracy. We used the entropy-based multi-interval discretization (MDL) method introduced by Fayyad and Irani [27]. For classification performance evaluation, we used the area under the ROC curve (AUC) and kappa as the major evaluation criteria; additionally, sensitivity, 1 − specificity, precision, F-measure, and accuracy were used for detailed performance analysis. All results were based on 10-fold cross-validation for statistical evaluation of the learning algorithm. All experiments were conducted with the Naive Bayes classifier in the WEKA software [28], and statistical analyses were conducted with SPSS version 19 for Windows (SPSS Inc., Chicago, IL, USA).
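A rough scikit-learn equivalent of this evaluation pipeline is sketched below. The paper used WEKA; the Fayyad-Irani MDL discretization step is omitted here because scikit-learn ships no such discretizer, and the feature matrix is synthetic, standing in for the facial features:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score, cohen_kappa_score

# Synthetic stand-in for the facial-feature matrix: two classes, five features.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)),   # "normal" class
               rng.normal(1.0, 1.0, (100, 5))])  # "overweight" class
y = np.array([0] * 100 + [1] * 100)

# Normalization to [0, 1] followed by Naive Bayes, evaluated with 10-fold CV.
model = make_pipeline(MinMaxScaler(), GaussianNB())
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
proba = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
pred = (proba >= 0.5).astype(int)

print(f"AUC:   {roc_auc_score(y, proba):.3f}")
print(f"kappa: {cohen_kappa_score(y, pred):.3f}")
```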

3. Results and Discussion

3.1. Performance Evaluation

As a brief summary of the performance evaluation, the AUC and kappa values for the 2 groups with and without the MDL method (i.e., the 2 preprocessing approaches) are depicted in Figure 2.

AUC values of the method using MDL in the 2 female groups ranged from 0.760 to 0.861, whereas AUC values of the method without MDL ranged from 0.730 to 0.771. The AUC and kappa values of the method using MDL showed improvements of 0.09 and 0.115, respectively, in the female: 21–40 group, and of 0.03 and 0.073, respectively, in female: 41–60.

Comparing AUC and kappa values, the classification performance of the method with MDL was higher than that of the method without MDL, indicating that applying MDL discretization substantially improves BMI classification.

The identification of normal and overweight subjects in the female: 41–60 group was more difficult than in the female: 21–40 group. The exact reason for this is unknown, but obesity- and menopause-related research offers some clues [29–31]. Menopause leads to changes in fat tissue distribution, body composition, waist-to-hip ratio (WHR), and waist-to-height ratio (W/Ht) in females. For instance, Douchi et al. [29] demonstrated that the lean mass of the head did not differ between premenopausal and postmenopausal females, whereas that of the trunk and legs was altered following menopause. Detailed results of the performance evaluation for each class and group are given in Tables 3 and 4. We believe these results demonstrate the possibility of predicting normal and overweight status from human face information.


Group, class: sensitivity / 1 − specificity / precision / F-measure / accuracy

Female: 21–40, Normal: 0.884 / 0.377 / 0.852 / 0.868 / 80.8%
Female: 41–60, Normal: 0.653 / 0.253 / 0.685 / 0.668 / 70.4%

Female: 21–40, Normal: 0.788 / 0.364 / 0.842 / 0.814 / 74.4%
Female: 41–60, Normal: 0.684 / 0.354 / 0.620 / 0.650 / 66.4%

3.2. Statistical Analysis of Facial Features

Statistical analysis of the comparison between the normal and overweight classes was performed using an independent two-sample t-test, and a P-value < 0.05 was considered statistically significant. Features with a P-value < 0.05 in each group are listed in Tables 5 and 6.

Feature: Normal mean (SD), Overweight mean (SD); t; P-value

FD17_26: Normal 9.473 (1.317), Overweight 8.941 (1.115); t = 3.118, P = 0.002
FD117_126: Normal 9.483 (1.303), Overweight 8.904 (1.257); t = 3.319, P = 0.001
FDH25_125: Normal 96.53 (5.116), Overweight 98.52 (6.32); t = −2.69, P = 0.0076
FDH36_136: Normal 23.57 (2.469), Overweight 24.46 (2.191); t = −2.75, P = 0.0064
FD18_25: Normal 29.94 (2.675), Overweight 30.68 (2.753); t = −2.036, P = 0.0428
FD43_143: Normal 125.2 (7.101), Overweight 133.6 (7.384); t = −8.625, P = 0.0000
FD53_153: Normal 145.4 (5.941), Overweight 150.7 (7.642); t = −5.991, P = 0.0000
FD94_194: Normal 140.1 (6.022), Overweight 147.6 (6.934); t = −8.875, P = 0.0000
FDH33_133: Normal 147.2 (5.63), Overweight 153.1 (7.02); t = −7.261, P = 0.0000
FA18_17_25: Normal 126.2 (6.591), Overweight 128.6 (6.75); t = −2.684, P = 0.0077
FA118_117_125: Normal 125 (7.339), Overweight 128.3 (6.199); t = −3.56, P = 0.0004
FA18_25_43: Normal 95.38 (5.104), Overweight 97.91 (4.896); t = −3.722, P = 0.0002
FA118_125_143: Normal 96.16 (4.753), Overweight 98.39 (5.082); t = −3.396, P = 0.0008
FA18_17_43: Normal 76.97 (6.255), Overweight 80.66 (6.108); t = −4.39, P = 0.0000
FA118_117_143: Normal 76.82 (6.824), Overweight 80.9 (5.583); t = −4.644, P = 0.0000
FA117_125: Normal 21.24 (3.645), Overweight 19.19 (4.142); t = 3.983, P = 0.0001
FA17_18: Normal 34.01 (5.091), Overweight 32.61 (5.32); t = 2.002, P = 0.0463
FR02_psu: Normal 0.318 (0.044), Overweight 0.293 (0.041); t = 4.199, P = 0.0000
FR05_psu: Normal 1.178 (0.055), Overweight 1.148 (0.048); t = 4.183, P = 0.0000
FR06_psu: Normal 2.039 (0.117), Overweight 2.123 (0.115); t = −5.334, P = 0.0000
FR08_psu: Normal 1.736 (0.151), Overweight 1.854 (0.147); t = −5.783, P = 0.0000
FArea02: Normal 6470 (644.4), Overweight 6654 (652.2); t = −2.106, P = 0.0362
FArea03: Normal 3596 (364.9), Overweight 3873 (361.9); t = −5.637, P = 0.0000
Fh_Cur_Max_Distan: Normal 3.654 (1.564), Overweight 3.233 (1.585); t = 1.984, P = 0.0483
FDH12_14: Normal 18.58 (2.713), Overweight 19.69 (2.817); t = −3.006, P = 0.0029
Nose_Angle_14_12: Normal 61.07 (4.611), Overweight 59.29 (4.108); t = 2.946, P = 0.0035
Nose_Angle_12_14_21: Normal 106.7 (4.634), Overweight 105.1 (5.237); t = 2.397, P = 0.0172
EUL_L_el2: Normal −0.637 (0.095), Overweight −0.597 (0.087); t = −3.135, P = 0.0019
EUL_L_el3: Normal −0.22 (0.118), Overweight −0.17 (0.11); t = −3.206, P = 0.0015
EUL_L_el6: Normal 0.483 (0.105), Overweight 0.432 (0.113); t = 3.473, P = 0.0006
EUL_L_DH: Normal 3.178 (0.248), Overweight 3.268 (0.292); t = −2.53, P = 0.0120
EUL_L_Sf: Normal 0.408 (0.106), Overweight 0.371 (0.132); t = 2.442, P = 0.0153
EUL_R_er2: Normal −0.63 (0.087), Overweight −0.582 (0.095); t = −3.957, P = 0.0001
EUL_R_er3: Normal −0.208 (0.112), Overweight −0.167 (0.1); t = −2.822, P = 0.0051
EUL_R_er6: Normal 0.466 (0.106), Overweight 0.43 (0.111); t = 2.492, P = 0.0133
EUL_R_er7: Normal 0.647 (0.235), Overweight 0.556 (0.29); t = 2.432, P = 0.0165
EUL_R_DH: Normal 3.188 (0.226), Overweight 3.322 (0.241); t = −4.292, P = 0.0000
EUL_R_RMAX: Normal 0.443 (0.069), Overweight 0.424 (0.066); t = 2.061, P = 0.0403
EUL_R_St: Normal −0.633 (0.117), Overweight −0.592 (0.123); t = −2.525, P = 0.0122
EUL_R_Sf: Normal 0.395 (0.106), Overweight 0.36 (0.104); t = 2.452, P = 0.0149
EUL_R_Khmean: Normal 0.024 (0.007), Overweight 0.022 (0.007); t = 2.868, P = 0.0045
PDH44_53: Normal 89.38 (6.081), Overweight 91.79 (5.527); t = −3.017, P = 0.0028

Feature: Normal mean (SD), Overweight mean (SD); t; P-value

FDH25_125: Normal 94.63 (5.466), Overweight 96.29 (5.493); t = −3.097, P = 0.0021
FDH36_136: Normal 24.84 (2.283), Overweight 25.36 (2.805); t = −2.055, P = 0.0405
FD18_25: Normal 29.37 (3.287), Overweight 30.04 (2.923); t = −2.199, P = 0.0284
FD17_25: Normal 17.83 (2.717), Overweight 18.36 (2.471); t = −2.076, P = 0.0385
FD43_143: Normal 127.4 (6.471), Overweight 133.1 (7.721); t = −8.184, P = 0.0000
FD53_153: Normal 143.9 (6.343), Overweight 147.2 (7.141); t = −4.848, P = 0.0000
FD94_194: Normal 141.8 (6.01), Overweight 146.9 (6.485); t = −8.385, P = 0.0000
FDH33_133: Normal 146.8 (6.057), Overweight 150.9 (6.582); t = −6.615, P = 0.0000
FA18_25_43: Normal 99.88 (5.308), Overweight 101.2 (4.954); t = −2.589, P = 0.0100
FA118_125_143: Normal 99.74 (4.776), Overweight 101.9 (5.373); t = −4.343, P = 0.0000
FA117_125_143: Normal 124.7 (5.38), Overweight 126 (5.471); t = −2.438, P = 0.0152
FA18_17_43: Normal 81.11 (6.753), Overweight 82.85 (6.574); t = −2.676, P = 0.0077
FA118_117_143: Normal 80.69 (6.449), Overweight 83.16 (7.35); t = −3.632, P = 0.0003
FR02_psu: Normal 0.295 (0.044), Overweight 0.285 (0.051); t = 2.182, P = 0.0297
FR05_psu: Normal 1.154 (0.046), Overweight 1.135 (0.049); t = 3.966, P = 0.0001
FR06_psu: Normal 2.006 (0.104), Overweight 2.068 (0.121); t = −5.688, P = 0.0000
FR08_psu: Normal 1.743 (0.134), Overweight 1.827 (0.157); t = −5.935, P = 0.0000
FArea02: Normal 6358 (618.3), Overweight 6501 (696.7); t = −2.212, P = 0.0275
FArea03: Normal 3886 (397.6), Overweight 4052 (402.6); t = −4.245, P = 0.0000
FDV12_14: Normal 33.85 (3.313), Overweight 33 (3.571); t = 2.516, P = 0.0123
FDH14_21: Normal 12.9 (1.633), Overweight 12.53 (1.889); t = 2.163, P = 0.0311
Nose_Angle_14_21: Normal 45.73 (4.983), Overweight 46.98 (5.765); t = −2.402, P = 0.0168

In female: 21–40, 42 features were significantly different between the normal and overweight classes (P < 0.05), and 11 of these features exhibited highly significant differences (P < 0.0001). Four features measuring distances between points n1 and n2 in a frontal image (FD43_143, FD53_153, FD94_194, and FDH33_133, related to the mandibular width or face width) exhibited particularly significant differences. The features FA18_17_43 and FA118_117_143, representing the angles between three points n1 (medial canthus), n2 (midpoint of the upper eyelid), and n3 (mandibular ramus) in a frontal image, were also highly significantly different. Comparing the female: 21–40 and female: 41–60 groups, many features related to the eyelid were significant in female: 21–40 but not in female: 41–60. For instance, EUL_R_DH (horizontal distance from er1 to er7 in the eye image) was highly significantly different between the normal and overweight classes; the means of EUL_R_DH in the normal and overweight classes were 3.188 (0.226) and 3.322 (0.241), respectively (t = −4.292, P < 0.0001). In female: 41–60, a total of 21 features were significantly different between the normal and overweight classes, and 8 of these were highly significantly different (FD43_143, FD53_153, FD94_194, FDH33_133, FA118_125_143, FR06_psu, FR08_psu, and FArea03; P < 0.0001).

Many features that differed significantly between the normal and overweight classes in only one age group were identified. Twenty-five features, such as EUL_R_St, FD117_126, Fh_Cur_Max_Distan, FDH12_14, EUL_R_DH, and EUL_R_Khmean, were found only in the female: 21–40 group, while FD17_25, FA117_125_143, FDV12_14, FDH14_21, and Nose_Angle_14_21 were found only in female: 41–60.

3.3. Medical Applications and Limitations

Patients or potential patients with obesity-related diseases must regularly check their own BMI based on their weight. Measurements using calibrated scales and rulers are ideal, but they are not always possible for the critically ill [32] or in real-time telemedicine and emergency medical services at remote locations. Our method was designed for situations in which such measurements cannot be made, for example, elderly trauma or intensive care in emergency medicine and remote healthcare.

Several studies have examined patient BMI and weight estimation in emergency medical services and telemedicine [32–35]. Accurate estimates are important for drug dosage, countershock voltage calculation, and treatment, particularly in serious illness such as elderly trauma or intensive care [33, 34]. Moreover, many patients are not aware of their body weight because it changes over time. For example, although patient self-estimates of weight are better than estimates by residents and nurses in emergency departments, 22% of patients do not estimate their own weight to within 5 kg [34]. The method described herein can provide clues for the development of alternative methods of BMI estimation in such situations and in telemedicine, because facial characteristics provide substantial clinical information on the present or future health conditions of patients [18, 19].

4. Conclusions

The relationship among obesity, disease, and facial characteristics associated with health complications has long been researched. Here, we have proposed a method and demonstrated the possibility of identifying normal and overweight status using only facial characteristics, finding statistically significant differences between the 2 classes in 2 female groups. Although problems remain to be solved before BMI status can be fully classified, this method provides basic information of benefit to studies in face recognition, obesity, facial morphology, medical science, telemedicine, and emergency medicine.


Acknowledgments

This work was supported in part by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (20110027738).


References

  1. J. O. Hill and J. C. Peters, “Environmental contributions to the obesity epidemic,” Science, vol. 280, no. 5368, pp. 1371–1374, 1998.
  2. A. G. Comuzzie and D. B. Allison, “The search for human obesity genes,” Science, vol. 280, no. 5368, pp. 1374–1377, 1998.
  3. J. P. Després and I. Lemieux, “Abdominal obesity and metabolic syndrome,” Nature, vol. 444, no. 7121, pp. 881–887, 2006.
  4. H. Hirose, T. Takayama, S. Hozawa, T. Hibi, and I. Saito, “Prediction of metabolic syndrome using artificial neural network system based on clinical data including insulin resistance index and serum adiponectin,” Computers in Biology and Medicine, vol. 41, no. 11, pp. 1051–1056, 2011.
  5. L. L. Yan, M. L. Daviglus, K. Liu et al., “BMI and health-related quality of life in adults 65 years and older,” Obesity Research, vol. 12, no. 1, pp. 69–76, 2004.
  6. C. Ni Mhurchu, A. Rodgers, W. H. Pan et al., “Body mass index and cardiovascular disease in the Asia-Pacific Region: an overview of 33 cohorts involving 310 000 participants,” International Journal of Epidemiology, vol. 33, no. 4, pp. 751–758, 2004.
  7. T. Haas, S. Svacina, J. Pav, R. Hovorka, P. Sucharda, and J. Sonka, “Risk calculation of type 2 diabetes,” Computer Methods and Programs in Biomedicine, vol. 41, no. 3-4, pp. 297–303, 1994.
  8. C. M. Y. Lee, S. Colagiuri, M. Ezzati, and M. Woodward, “The burden of cardiovascular disease associated with high body mass index in the Asia-Pacific region,” Obesity Reviews, vol. 12, no. 501, pp. e454–e459, 2011.
  9. L. Li, A. P. De Moira, and C. Power, “Predicting cardiovascular disease risk factors in midadulthood from childhood body mass index: utility of different cutoffs for childhood body mass index,” American Journal of Clinical Nutrition, vol. 93, no. 6, pp. 1204–1211, 2011.
  10. E. Anuurad, K. Shiwaku, A. Nogi et al., “The new BMI criteria for Asians by the regional office for the Western Pacific region of WHO are suitable for screening of overweight to prevent metabolic syndrome in elderly Japanese workers,” Journal of Occupational Health, vol. 45, no. 6, pp. 335–343, 2003.
  11. S. P. Hye, S. Y. Yeong, Y. P. Jung, S. K. Young, and M. C. Joong, “Obesity, abdominal obesity, and clustering of cardiovascular risk factors in South Korea,” Asia Pacific Journal of Clinical Nutrition, vol. 12, no. 4, pp. 411–418, 2003.
  12. J. Y. Kim, H. M. Chang, J. J. Cho, S. H. Yoo, and S. Y. Kim, “Relationship between obesity and depression in the Korean working population,” Journal of Korean Medical Science, vol. 25, no. 11, pp. 1560–1567, 2010.
  13. H. Fonseca, A. M. Silva, M. G. Matos et al., “Validity of BMI based on self-reported weight and height in adolescents,” Acta Paediatrica, International Journal of Paediatrics, vol. 99, no. 1, pp. 83–88, 2010.
  14. K. Sobottka and I. Pitas, “A novel method for automatic face segmentation, facial feature extraction and tracking,” Signal Processing: Image Communication, vol. 12, no. 3, pp. 263–281, 1998.
  15. Y. Wang, C. S. Chua, and Y. K. Ho, “Facial feature detection and face recognition from 2D and 3D images,” Pattern Recognition Letters, vol. 23, no. 10, pp. 1191–1202, 2002.
  16. C. L. Huang and Y. M. Huang, “Facial expression recognition using model-based feature extraction and action parameters classification,” Journal of Visual Communication and Image Representation, vol. 8, no. 3, pp. 278–290, 1997.
  17. M. H. Yang, D. J. Kriegman, and N. Ahuja, “Detecting faces in images: a survey,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 1, pp. 34–58, 2002.
  18. E. N. Reither, R. M. Hauser, and K. C. Swallen, “Predicting adult health and mortality from adolescent facial characteristics in yearbook photographs,” Demography, vol. 46, no. 1, pp. 27–41, 2009.
  19. J. A. Levine, A. Ray, and M. D. Jensen, “Relation between chubby cheeks and visceral fat,” New England Journal of Medicine, vol. 339, no. 26, pp. 1946–1947, 1998.
  20. A. Sadeghianrizi, C. M. Forsberg, C. Marcus, and G. Dahllöf, “Craniofacial development in obese adolescents,” European Journal of Orthodontics, vol. 27, no. 6, pp. 550–555, 2005.
  21. C. Frowd, C. Lee, A. Petkovic, K. Nawaz, and Y. Bashir, “Further automating and refining the construction and recognition of facial composite images,” International Journal of Bio-Science and Bio-Technology, vol. 1, no. 1, pp. 59–74, 2009.
  22. C. D. Frowd, S. Ramsay, and P. J. B. Hancock, “The influence of holistic interviewing on hair perception for the production of facial composites,” International Journal of Bio-Science and Bio-Technology, vol. 3, no. 3, pp. 55–64, 2011.
  23. M. Soltane, N. Doghmane, and N. Guersi, “Face and speech based multi-modal biometric authentication,” International Journal of Advanced Science and Technology, vol. 21, no. 6, pp. 41–56, 2010.
  24. World Health Organization, International Association for the Study of Obesity, and International Obesity TaskForce, The Asia-Pacific Perspective: Redefining Obesity and Its Treatment, Health Communications, Sydney, Australia, 2000.
  25. C. Barba, T. Cavalli-Sforza, J. Cutter et al., “Appropriate body-mass index for Asian populations and its implications for policy and intervention strategies,” Lancet, vol. 363, no. 9403, pp. 157–163, 2004.
  26. D. D. Pham, J. H. Do, B. Ku, H. J. Lee, H. Kim, and J. Y. Kim, “Body mass index and facial cues in Sasang typology for young and elderly persons,” Evidence-Based Complementary and Alternative Medicine, vol. 2011, Article ID 749209, 9 pages, 2011.
  27. U. M. Fayyad and K. B. Irani, “Multi-interval discretization of continuous-valued attributes for classification learning,” in Proceedings of the 13th International Joint Conference on Artificial Intelligence, vol. 2, pp. 1022–1027, 1993.
  28. M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, “The WEKA data mining software: an update,” SIGKDD Explorations, vol. 11, pp. 10–18, 2009.
  29. T. Douchi, S. Yamamoto, S. Nakamura et al., “The effect of menopause on regional and total body lean mass,” Maturitas, vol. 29, no. 3, pp. 247–252, 1998.
  30. M. Skrzypczak and A. Szwed, “Assessment of the body mass index and selected physiological parameters in pre- and post-menopausal women,” HOMO - Journal of Comparative Human Biology, vol. 56, no. 2, pp. 141–152, 2005.
  31. Q. Wang, C. Hassager, P. Ravn, S. Wang, and C. Christiansen, “Total and regional body-composition changes in early postmenopausal women: age-related or menopause-related?” American Journal of Clinical Nutrition, vol. 60, no. 6, pp. 843–848, 1994.
  32. D. Krieser, K. Nguyen, D. Kerr, D. Jolley, M. Clooney, and A. M. Kelly, “Parental weight estimation of their child's weight is more accurate than other weight estimation methods for determining children's weight in an emergency department,” Emergency Medicine Journal, vol. 24, no. 11, pp. 756–759, 2007.
  33. T. R. Coe, M. Halkes, K. Houghton, and D. Jefferson, “The accuracy of visual estimation of weight and height in pre-operative supine patients,” Anaesthesia, vol. 54, no. 6, pp. 582–586, 1999.
  34. W. L. Hall, G. L. Larkin, M. J. Trujillo, J. L. Hinds, and K. A. Delaney, “Errors in weight estimation in the emergency department: comparing performance by providers and patients,” Journal of Emergency Medicine, vol. 27, no. 3, pp. 219–224, 2004.
  35. S. Menon and A. M. Kelly, “How accurate is weight estimation in the emergency department?” Emergency Medicine Australasia, vol. 17, no. 2, pp. 113–116, 2005.

Copyright © 2012 Bum Ju Lee et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
