Review Article

Academic Emotion Classification Using FER: A Systematic Review

Table 5

Summary of commonly used FER datasets.

Dataset: FER-2013
Type of emotions: 6 basic facial emotions + 1 neutral emotion
Data configuration:
(i) Roughly 35,000 grayscale images (a loading sketch follows this table)
(ii) 48×48-pixel facial images with various facial expressions

Dataset: JAFFE
Type of emotions: 6 basic facial emotions + 1 neutral emotion
Data configuration:
(i) 213 images of various facial expressions
(ii) Ten different Japanese female subjects
(iii) Image resolution of 256×256 pixels

Dataset: CK+
Type of emotions: Happiness, surprise, disgust, sadness, anger, fear, and contempt
Data configuration:
(i) 593 video sequences
(ii) 123 participants of varying gender and ethnic heritage, aged between 18 and 50 years
(iii) Captured at 30 FPS with a resolution of either 640×490 or 640×480 pixels

Dataset: KDEF
Type of emotions: 6 basic facial emotions + 1 neutral emotion
Data configuration:
(i) 4,900 photos of human facial emotions from 70 individuals
(ii) Captured from five different angles
(iii) 35 males and 35 females aged between 20 and 30 years
(iv) Image resolution of 562×762 pixels
(v) Subjects without beards, eyeglasses, earrings, or moustaches, and with no noticeable make-up

Dataset: DISFA
Type of emotions: Intensity of 12 coded action units (AUs)
Data configuration:
(i) Stereo videos of 12 females and 15 males from various ethnic groups
(ii) Image resolution of 1024×768 pixels
(iii) 66 facial landmark points

Dataset: DISFA+
Type of emotions: 5-level intensity of twelve FACS facial action units
Data configuration:
(i) Extension of the DISFA database
(ii) Manually labeled frame-based annotations of 5-level intensity for 12 FACS facial actions
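
Of the datasets above, FER-2013 is the one most often loaded directly into training pipelines: it is commonly distributed as a single CSV file in which each row carries an integer emotion label and a space-separated string of 2,304 pixel values that reshapes to one 48×48 grayscale image. The snippet below is a minimal parsing sketch under that assumption; the fer2013.csv path, the load_fer2013 name, and the label ordering are illustrative assumptions rather than details taken from the reviewed papers.

# Minimal sketch of parsing FER-2013, assuming the commonly distributed
# fer2013.csv layout with "emotion", "pixels", and "Usage" columns.
# The path, function name, and label order are assumptions used only to
# illustrate the 48x48 grayscale data configuration summarized in Table 5.
import numpy as np
import pandas as pd

# Conventional FER-2013 label order (0-6); verify against your copy of the data.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def load_fer2013(csv_path="fer2013.csv"):
    """Parse the CSV into (N, 48, 48) uint8 images and an (N,) integer label array."""
    df = pd.read_csv(csv_path)
    images = np.stack([
        np.array(row.split(), dtype=np.uint8).reshape(48, 48)
        for row in df["pixels"]
    ])
    labels = df["emotion"].to_numpy()
    return images, labels

if __name__ == "__main__":
    images, labels = load_fer2013()
    print(images.shape, labels.shape)  # roughly (35887, 48, 48) and (35887,)

Running this once is a quick sanity check that the image count and the 48×48 image shape match the configuration listed in Table 5.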