Abstract
Introduction: The study of facial emotion recognition is under-explored in subjects with mild cognitive impairment (MCI). We investigated whether deficits in facial emotion recognition are present in patients with MCI and analyzed the relationship between facial emotion recognition and different domains of cognitive function. Methods: This study included 300 participants aged 60 years or older with cognitive decline. We evaluated 181 MCI and 119 non-MCI subjects using the Seoul Neuropsychological Screening Battery-Core (SNSB-C) and a facial emotion recognition task with six facial expressions (anger, disgust, fear, happiness, sadness, and surprise). A Generalized Linear Model (GLM) was used to assess the association between cognitive performance and accuracy of facial emotion recognition and to compare facial emotion recognition within the MCI group according to impairment in five domains of cognitive function. The model was adjusted for age, sex, years of education, and depressive symptoms. Results: Patients with MCI had lower accuracy in recognizing total facial emotion (0.48 vs. 0.53; p = 0.0003) and surprise (0.73 vs. 0.81; p = 0.0215) than cognitively healthy subjects. We also found that the frontal/executive function domain (Digit Symbol Coding [DSC, 0.38 vs. 0.49; p < 0.0001], Controlled Oral Word Association Test [COWAT, 0.42 vs. 0.49; p = 0.0001], Korean-Trail Making Test [K-TMT, 0.37 vs. 0.48; p = 0.0073], Korean-Color Word Stroop Test [K-CWST, 0.43 vs. 0.49; p = 0.0219]) and the language domain (Short Form of the Korean-Boston Naming Test [S-K-BNT, 0.46 vs. 0.47; p = 0.003]) were statistically associated with deficits in facial emotion recognition in patients with MCI. Conclusion: We observed a significant association between deficits in facial emotion recognition and cognitive impairment in elderly individuals.
Introduction
Over the next five decades, the aging population is predicted to increase by 21% [1], with increases of 140% and 51% expected in developing and developed countries, respectively [1]. Given this projected increase, dementia prevention has been emphasized as a public health priority in contemporary society [2]. Mild cognitive impairment (MCI) is a condition in which cognitive function, in domains such as memory and attention, deteriorates enough to affect the performance of specific tasks; it is regarded as a transitional stage toward dementia [1‒3]. Each year, 10–15% of individuals with MCI progress to dementia [1‒3].
With the growth of the elderly population, the prevalence of MCI and the rate of transition from MCI to dementia are anticipated to increase. However, with appropriate diagnosis and treatment, MCI can revert to normal cognition without further progression to dementia [1]. Thus, early detection of MCI is recommended to slow or halt its progression to dementia [2].
Much attention has been focused on identifying early signs of cognitive impairment [4, 5]. An emerging research topic is the significant difference in facial emotion recognition between cognitively impaired patients, including those with Alzheimer’s disease (AD), and non-impaired individuals [6‒8]. In general, the ability to identify an emotion from a facial expression is considered an important aspect of social cognition. The inability to recognize others’ emotions has been reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia, and AD [9‒11]. Previous studies have shown greater deficits in facial emotion recognition (particularly for negative emotions such as anger, sadness, and fear) in patients with MCI or AD than in cognitively healthy individuals [12‒16]. Teng et al. [17] suggested that facial emotion processing may be impaired in patients with MCI before more marked cognitive deficits appear. The neurological components involved in emotional processing therefore appear to be affected by cognitive impairment [18].
Despite these studies, whether cognitive impairment is associated with deficits in facial emotion recognition, and which brain regions might underlie this association, has not been fully investigated. In this study, we examined the association between facial emotion recognition and cognitive function by comparing participants with MCI to those with normal cognition. We also examined which cognitive domains are associated with facial emotion recognition in order to identify the brain regions likely involved. Because MCI, unlike dementia, can revert to normal cognition, facial emotion recognition may offer a quick means of detecting cognitive impairment, allowing patients with MCI to be treated before they progress to the irreversible stage of dementia.
Methods
Study Population
The target population for this study was individuals aged 60 years or older who visited the Department of Neurology at the Veterans Health Service Medical Center (Seoul, Republic of Korea) between March and September 2021. The inclusion criteria were as follows: (1) patients who complained of cognitive decline, (2) patients who could independently complete the clinical tests and questionnaires, and (3) patients who agreed to participate in the study. The exclusion criteria were as follows: (1) patients diagnosed with dementia (ICD-10: F00-F09, G30), (2) patients diagnosed with brain infarction, cerebral hemorrhage, or Parkinson’s disease, (3) patients suffering from another serious disease (e.g., cancer or mental illness), and (4) patients on medication affecting cognitive function. Experienced neurological clinicians applied the inclusion and exclusion criteria. The participants completed a neuropsychological assessment and a facial emotion recognition task. A total of 341 participants volunteered for the study and provided informed consent upon enrollment. Of these, 41 dropped out owing to an incomplete neuropsychological test (n = 9) or refusal to complete the facial emotion recognition task (n = 38); six of these participants completed neither assessment. After these exclusions, 300 participants (87.97% of those enrolled) were included in the analysis.
Neuropsychological Assessment
MCI was diagnosed by two physicians based on clinical and neuropsychological assessment. Neuropsychological performance was assessed using the Seoul Neuropsychological Screening Battery-Core (SNSB-C). The SNSB-C comprises 14 tasks that examine domains of cognitive function, including attention, language, memory, visuospatial function, and frontal/executive function [19‒21]. These are the Vigilance Test and Digit Span Test (DST) for attention; the Short Form of the Korean-Boston Naming Test (S-K-BNT) for language; the Elderly’s Version of the Seoul Verbal Learning Test (SVLT) for memory; the Rey Complex Figure Test (RCFT) for visuospatial function; and Digit Symbol Coding (DSC), the Controlled Oral Word Association Test (COWAT), the Elderly’s Version of the Korean-Trail Making Test (K-TMT), and the Korean-Color Word Stroop Test (K-CWST) for frontal/executive function [19‒21]. The composite score of the SNSB-C has been validated for differentiating patients with MCI and dementia from normally aging adults [20]. The composite score is expressed as a z-score standardized for age, sex, and years of education. Based on the percentile scores of the SNSB-C, participants were assigned to either the MCI or the non-MCI group: participants scoring above the 16th percentile on all tasks were categorized as non-MCI, whereas those scoring at or below the 16th percentile on one or more tasks were categorized as MCI [19‒21].
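As an illustration of this grouping rule, the following minimal sketch applies the 16th-percentile criterion to one participant's per-task percentile scores. It is not the study's actual pipeline; the function name, task labels, and example values are hypothetical.

```python
def classify_cognitive_status(task_percentiles):
    """Label a participant 'MCI' or 'non-MCI' from SNSB-C task percentile scores.

    task_percentiles: dict mapping each SNSB-C task to the participant's
    percentile score (already standardized for age, sex, and education).
    Rule described above: any score at or below the 16th percentile -> MCI.
    """
    return "MCI" if any(p <= 16 for p in task_percentiles.values()) else "non-MCI"


# Hypothetical example (not study data): COWAT falls at the 12th percentile.
example_scores = {"DSC": 48, "COWAT": 12, "K-TMT": 55, "K-CWST": 40,
                  "SVLT": 30, "RCFT": 25, "S-K-BNT": 60, "DST": 35}
print(classify_cognitive_status(example_scores))  # -> "MCI"
```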
Facial Emotion Recognition Task
The stimuli for the facial emotion recognition task were taken from the “ChaeLee Korean Facial Expressions of Emotion,” which was developed and standardized by the Catholic University of Korea College of Medicine [22]. For the forced-choice emotion recognition task, six facial expressions (anger, disgust, fear, happiness, sadness, and surprise), as categorized by Ekman [23], were presented on a computer screen. When presented with an image, participants identified the emotion displayed on the screen and indicated their response by pressing the number key corresponding to that emotion, responding as quickly as possible. The images were displayed in random order within one block (six facial expressions of 4 men and 4 women; a total of 24 images). All participants completed two practice blocks. After confirming the participants’ understanding of the procedure, the actual task was administered (24 trials). Each facial stimulus was set to appear for 8,000 ms, and the inter-trial interval was 1,000 ms. PsychoPy2 was used to create and administer the facial emotion recognition task.
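A trial loop with these parameters (8,000 ms stimulus window, 1,000 ms inter-trial interval, 24 randomized images, forced choice via number keys) could look roughly like the PsychoPy sketch below. This is not the authors' script: the image file layout, key-to-emotion mapping, and window settings are assumptions, and for simplicity the image is cleared as soon as a key is pressed rather than remaining for the full 8,000 ms.

```python
# Minimal PsychoPy sketch of one block of the forced-choice task (illustrative only).
import random
from psychopy import visual, core, event

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
KEYS = ["1", "2", "3", "4", "5", "6"]      # assumed number-key mapping, one key per emotion
STIM_WINDOW = 8.0                          # maximum stimulus/response window, seconds
ITI = 1.0                                  # inter-trial interval, seconds

win = visual.Window(fullscr=True, color="grey")

# Hypothetical file layout: 6 expressions x 4 posers = 24 images per block.
trials = [f"stimuli/{emotion}_{poser:02d}.jpg"
          for emotion in EMOTIONS for poser in range(1, 5)]
random.shuffle(trials)

responses = []
for image_path in trials:
    visual.ImageStim(win, image=image_path).draw()
    win.flip()
    # Wait up to 8 s for one of the six number keys; record the key and its timestamp.
    keys = event.waitKeys(maxWait=STIM_WINDOW, keyList=KEYS, timeStamped=core.Clock())
    responses.append((image_path, keys[0] if keys else None))
    win.flip()                             # blank screen between trials
    core.wait(ITI)                         # 1,000 ms inter-trial interval

win.close()
```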
Variables of Interest
The demographic covariates included age, sex, years of education, and depressive symptoms. The Short Version of the Geriatric Depression Scale (SGDepS) was used to evaluate depressive symptoms. The SGDepS is a validated 15-item questionnaire on mood, energy, anxiety, hopefulness, satisfaction, inattention, and insomnia, with “yes” or “no” responses. The cut-off score for the SGDepS is 8, with a score of 8 or above indicating depressive symptoms. The SGDepS has shown sufficient internal consistency (alpha = 0.86) and test-retest reliability (r = 0.81) to support its use as a clinical instrument [24].
Statistical Analysis
Continuous variables (age, years of education, SNSB-C scores, and recognition scores for the six facial expressions) were compared between the MCI and non-MCI groups using the t test. Categorical variables (i.e., sex) were compared between the two groups using the χ2 test. Pearson correlation analysis was conducted to determine the correlation between cognitive test scores and facial emotion recognition. Furthermore, a Generalized Linear Model (GLM) was fitted to evaluate the association between cognitive performance and facial emotion recognition, adjusted for age, sex, years of education, and depressive symptoms. Additionally, effect sizes were calculated to quantify the differences between the MCI and non-MCI groups [25]. We also compared facial emotion recognition within the MCI group according to impairment in the five cognitive domains of the SNSB-C (attention, language and related functions, visuospatial function, memory, and frontal/executive function). All analyses were performed using the Statistical Analysis System version 9.2 (SAS Institute, Cary, NC, USA), and statistical significance was set at p ≤ 0.05.
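To make the adjusted comparison and effect-size calculation concrete, the sketch below shows one way they could be computed. The study's analyses were run in SAS 9.2, so this Python version using statsmodels, with hypothetical column names (fer_total, group, age, sex, education, sgdeps), is only an illustration of the approach, not the authors' code.

```python
# Illustrative covariate-adjusted group comparison and pooled-SD effect size.
# Column names below are hypothetical, not taken from the study dataset.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_group_effect(df):
    """Gaussian GLM of total facial emotion recognition accuracy on MCI status,
    adjusted for age, sex, years of education, and depressive symptoms."""
    model = smf.glm("fer_total ~ C(group) + age + C(sex) + education + sgdeps",
                    data=df, family=sm.families.Gaussian()).fit()
    return model.params, model.pvalues

def cohens_d(x, y):
    """Cohen's d using the pooled standard deviation of two groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)
```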
Results
Participant Characteristics
Table 1 shows the characteristics of the study population stratified by cognitive status. Of the 300 participants, 181 (60.33%) had MCI and 119 (39.67%) had normal cognition. In the MCI group, the mean age and years of education were 75.08 and 10.63 years, respectively, and 103 participants (34.33% of the total sample) were men. In the non-MCI group, the mean age and years of education were 74.62 and 10.86 years, respectively, and 57 participants (19.00% of the total sample) were men. There were no significant differences in age, sex, or years of education between the MCI and non-MCI groups. The mean SGDepS score was 6.03 in the MCI group and 5.29 in the non-MCI group; this difference in depressive symptoms was also not significant.
| | MCI (n = 181) | Non-MCI (n = 119) | p value |
| --- | --- | --- | --- |
| Age, mean (SD), years | 75.08 (5.58) | 74.62 (5.23) | 0.4787 |
| Sex, n (%) | | | 0.1557 |
| Male | 103 (64.38) | 57 (35.62) | |
| Female | 78 (55.71) | 62 (44.29) | |
| Education, mean (SD), years | 10.63 (4.38) | 10.86 (4.10) | 0.6465 |
| SGDepS, mean (SD) | 6.03 (3.91) | 5.29 (3.21) | 0.0740 |
Values are expressed as mean (standard deviation) or n (%); percentages for sex are calculated within each sex across the MCI and non-MCI groups.
SGDepS, Short Version of Geriatric Depression Scale.
Comparison of Neuropsychological Test between MCI and Non-MCI
Table 2 shows the raw scores and percentile scores for the neuropsychological tests of the SNSB-C in the MCI and non-MCI groups. Overall, the MCI group performed significantly worse than the non-MCI group on all tasks of the neuropsychological test, for both raw and percentile scores (p < 0.0001).
| | MCI (n = 181), mean (SD) | Non-MCI (n = 119), mean (SD) | p value |
| --- | --- | --- | --- |
| Raw scores | | | |
| Frontal/executive function | | | |
| DSC | 40.41 (14.15) | 50.28 (13.15) | <0.0001 |
| COWAT | 12.79 (3.98) | 16.16 (4.24) | <0.0001 |
| K-TMT | 69.87 (58.06) | 40.59 (18.28) | <0.0001 |
| K-CWST | 80.24 (21.41) | 92.69 (17.54) | <0.0001 |
| Memory | | | |
| SVLT | 3.49 (2.57) | 6.44 (1.94) | <0.0001 |
| Visuospatial function | | | |
| RCFT | 27.85 (5.10) | 31.66 (2.37) | <0.0001 |
| Language | | | |
| S-K-BNT | 11.74 (2.18) | 12.86 (1.73) | <0.0001 |
| Attention | | | |
| DST | 8.72 (2.04) | 10.35 (2.08) | <0.0001 |
| Percentile scores | | | |
| Frontal/executive function | | | |
| DSC | 48.44 (28.00) | 72.03 (23.52) | <0.0001 |
| COWAT | 35.64 (25.41) | 63.37 (25.43) | <0.0001 |
| K-TMT | 54.80 (24.80) | 74.09 (13.47) | <0.0001 |
| K-CWST | 34.50 (26.86) | 63.56 (23.50) | <0.0001 |
| Memory | | | |
| SVLT | 26.23 (26.30) | 57.90 (24.46) | <0.0001 |
| Visuospatial function | | | |
| RCFT | 24.90 (22.66) | 47.92 (21.73) | <0.0001 |
| Language | | | |
| S-K-BNT | 54.97 (29.13) | 68.93 (23.07) | <0.0001 |
| Attention | | | |
| DST | 37.81 (26.45) | 61.76 (26.05) | <0.0001 |
Comparison of Facial Emotion Recognition between MCI and Non-MCI
Table 3 compares the mean and standard deviation (SD) values for recognition of the six facial expressions (anger, disgust, fear, happiness, sadness, and surprise) between the MCI and non-MCI groups. The non-MCI group outperformed the MCI group in facial emotion recognition. Significant differences in recognition of total facial emotion (mean: 0.47 vs. 0.53, p < 0.0001) and surprise (mean: 0.73 vs. 0.81, p = 0.0087) were detected between the MCI and non-MCI groups. After adjusting for age, sex, years of education, and depressive symptoms, the MCI group still performed worse than the non-MCI group in recognizing total facial emotion (adjusted mean: 0.48 vs. 0.53, p = 0.0003) and surprise (0.73 vs. 0.81, p = 0.0215). Likewise, the effect sizes for total facial emotion recognition (0.44 unadjusted, 0.37 adjusted) and surprise (0.29 for both) were relatively large compared with those for the other facial expressions, both before and after adjustment.
| | MCI (n = 181), mean (SD) | Non-MCI (n = 119), mean (SD) | p value | Effect size |
| --- | --- | --- | --- | --- |
| Unadjusted | | | | |
| Total facial emotion recognition | 0.47 (0.15) | 0.53 (0.12) | <0.0001 | 0.44 |
| Anger | 0.39 (0.25) | 0.42 (0.24) | 0.3931 | 0.12 |
| Disgust | 0.31 (0.32) | 0.37 (0.31) | 0.0941 | 0.19 |
| Fear | 0.17 (0.22) | 0.22 (0.23) | 0.0925 | 0.22 |
| Happiness | 0.86 (0.22) | 0.90 (0.17) | 0.0984 | 0.20 |
| Sadness | 0.73 (0.32) | 0.79 (0.25) | 0.0752 | 0.21 |
| Surprise | 0.73 (0.31) | 0.81 (0.24) | 0.0087 | 0.29 |
| Adjusted | | | | |
| Total facial emotion recognition | 0.48 | 0.53 | 0.0003 | 0.37 |
| Anger | 0.40 | 0.42 | 0.5483 | 0.04 |
| Disgust | 0.31 | 0.37 | 0.1166 | 0.37 |
| Fear | 0.18 | 0.22 | 0.1394 | 0.04 |
| Happiness | 0.87 | 0.90 | 0.2623 | 0.19 |
| Sadness | 0.74 | 0.79 | 0.2178 | 0.14 |
| Surprise | 0.73 | 0.81 | 0.0215 | 0.29 |
Adjusted values are means only, adjusted for age, sex, years of education, and depressive symptoms.
Correlation between Facial Emotion Recognition and Cognitive Function
Table 4 shows the Pearson correlation coefficients between recognition of the six facial expressions and overall cognitive function (SNSB-C percentile score). Cognitive function correlated significantly with the recognition of four facial expressions: disgust (r = 0.1337; p = 0.0221), happiness (r = 0.2137; p = 0.0002), sadness (r = 0.1356; p = 0.0187), and surprise (r = 0.1303; p = 0.0240).
| Facial emotion recognition (n = 300) | Anger | Disgust | Fear | Happiness | Sadness | Surprise |
| --- | --- | --- | --- | --- | --- | --- |
| Neuropsychological test (SNSB-C percentile score), r (p value) | 0.0813 (0.1604) | 0.1337 (0.0221) | 0.4537 (0.4345) | 0.2137 (0.0002) | 0.1356 (0.0187) | 0.1303 (0.0240) |
Neuropsychological performance was assessed using the Seoul Neuropsychological Screening Battery-Core (SNSB-C) and is expressed as a percentile score. Values are Pearson correlation coefficients with p values in parentheses.
Comparison of Facial Emotion Recognition in MCI Group according to Cognitive Tasks in SNSB-C
Figure 1 shows the mean and SD values of facial emotion recognition in the MCI group according to whether performance on each SNSB-C task was impaired. Within the MCI group, facial emotion recognition differed significantly between task-impaired and task-unimpaired participants for the S-K-BNT (mean: 0.46 vs. 0.47; p = 0.003), DSC (0.38 vs. 0.49; p < 0.0001), COWAT (0.42 vs. 0.49; p = 0.0001), K-TMT (0.37 vs. 0.48; p = 0.0073), and K-CWST (0.43 vs. 0.49; p = 0.0219). All of these tasks belong to the frontal/executive function domain except for the S-K-BNT, which corresponds to language function. Figure 1 therefore indicates which brain domains/regions are associated with facial emotion recognition based on SNSB-C task performance among MCI participants: by dividing the MCI participants into impaired and non-impaired subgroups for each SNSB-C task, we can infer the specific brain domains/regions associated with facial emotion recognition in cognitive impairment. The vertical axis represents the mean facial emotion recognition score.
Discussion
We found that patients with MCI were less accurate than cognitively healthy participants in overall facial emotion recognition and, specifically, in recognizing surprise. We also found that, among the cognitive domains, executive function was most strongly associated with deficits in facial emotion recognition in patients with MCI. It is possible that facial emotion recognition requires cognitive flexibility and inhibitory control to focus attention on the relevant features of facial expressions and respond to them promptly [7, 26]. This suggests that the frontal lobe, the brain region primarily responsible for executive function, may be important for facial emotion recognition.
Consistent with previous results, we observed that patients with MCI had more difficulty accurately recognizing facial emotions than individuals without MCI [27‒29]. Moreira et al. [27] found that 32 patients with MCI performed poorly on two forced-choice emotion recognition tasks. Pietschnig et al. [28] reported significant differences in facial emotion recognition among 137 patients with cognitive decline, including subjective cognitive decline, amnestic MCI, and non-amnestic MCI; using the short form of the Vienna Emotion Recognition Task, they observed that performance deteriorated as the severity of the cognitive decline diagnosis increased [28]. Sarabia-Cobo et al. [29] observed that emotion recognition accuracy increased with the intensity of facial expressions; because elderly participants with MCI had trouble identifying more of the facial expressions, the authors suggested that the neurological substrates involved in processing emotion may be affected [29].
Notably, specific brain domains may be associated with deficits in facial emotion recognition. Our data demonstrated that inaccuracy in facial emotion recognition increased in patients with lower scores on cognitive tasks evaluating language and frontal/executive function (Fig. 1). These results agree with previous studies showing that language function is associated with facial emotion recognition. Francisco et al. [30] collected data on language performance and recognition of facial expressions and found that language helps an individual build an emotion from an ambiguous sensation into a distinct expression. Because executive function was most strongly associated with facial emotion recognition, our findings also support existing evidence that the frontal lobe is the brain region most likely affected when facial emotion recognition becomes inaccurate [31‒35]. The frontal lobe is associated with various emotional and social behaviors. Nakamura et al. [32] used positron emission tomography to measure regional cerebral blood flow and found that the right inferior frontal cortex is involved in processing emotional communicative signals, both visual and auditory. Heberlein et al. [31] and Tsuchida and Fellows [35] observed that lesions in the prefrontal region, specifically in the ventromedial and ventrolateral regions, resulted in impaired emotion recognition. Wolf et al. [36] found that the ventromedial prefrontal cortex controls eye movements during facial emotion recognition: damage to the bilateral ventromedial prefrontal cortex impairs visual attention to the eye regions, particularly for fear [36]. Shdo et al. [37] also reported that diminished visual attention to facial expressions results in inaccurate perception of emotional valence in patients with frontotemporal dementia.
This study provides evidence of significant deficits in facial emotion recognition among older adults with MCI, using a relatively large study population and a neuropsychological assessment comparing diverse cognitive domains (i.e., memory, attention, executive function, and language). Prior studies have mostly focused on patients with dementia when examining differences in facial emotion recognition relative to healthy adults. By focusing specifically on older adults with MCI, this study attempts to identify impairments in cognitive function through facial emotion recognition before more marked deficits, such as AD, develop. However, this study had several limitations. First, because participants were recruited from a single hospital, the sample may not represent a diverse population; future studies should include larger samples from multiple hospitals so that the results can be generalized. Second, a causal relationship could not be established because this was a cross-sectional study; it is therefore not clear whether MCI causes inaccuracy in facial emotion recognition, and other unrecognized factors may be involved. Third, because the related brain regions were inferred from neuropsychological test results, they could have been measured more directly with tools such as functional magnetic resonance imaging. Lastly, it is difficult to predict whether the results of this study will remain applicable, as the diagnostic standards and definition of MCI are still evolving. Future studies should therefore be carefully designed with these suggestions in mind to better understand MCI and prevent severe cognitive decline in the elderly population.
Conclusion
We found a significant association between deficits in facial emotion recognition and cognitive impairment in this sample of elderly individuals. Inaccuracy in facial emotion recognition increased in participants with decreased executive function performance, which is related to the frontal cortex. This suggests that patients with MCI experience difficulties in identifying facial emotions and that frontal lobe dysfunction (corresponding to executive function) may be involved in this impairment. Although future studies are required to confirm this association in the general population, facial emotion recognition could be used as an efficient tool to identify elderly individuals with cognitive impairment. Because a full neuropsychological assessment lasts more than an hour, facial emotion recognition may offer a quicker means of identifying individuals with cognitive impairment before they progress to severe stages of dementia.
Statement of Ethics
All methods were carried out in accordance with relevant guidelines and regulations, including the Declaration of Helsinki. Study protocols were approved by the Institutional Ethics Review Board of the Veterans Health Service Medical Center (IRB No. BOHUN 2021-02-024, BOHUN 2021-01-066). All participants provided written informed consent to participate in the study.
Conflict of Interest Statement
The authors declare that they have no competing interests.
Funding Sources
This research was supported by the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI) and Korea Dementia Research Center (KDRC), funded by the Ministry of Health and Welfare and Ministry of Science and ICT, South Korea (Grant No. HU20C0487). This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (Grant No. 2022R1A2C2010463, RS-2024-00338688). This work was supported by VHS Medical Center Research Grant, South Korea (VHSMC 22026). This work was supported by the Education and Research Encouragement Fund of Seoul National University Hospital.
Author Contributions
J.Y. Min and K.B. Min supervised the study. E.Y. Ju, J.Y. Min, and K.B. Min designed the study. E.Y. Ju, C.Y. Kim, B.Y. Choi, and S.W. Ryoo collected the data. E.Y. Ju and C.Y. Kim analyzed and interpreted the data. B.Y. Choi and S.W. Ryoo reviewed the literature. E.Y. Ju, C.Y. Kim, and J.Y. Min drafted and revised the manuscript. All authors read and approved the final manuscript.
Data Availability Statement
The datasets generated and analyzed during the current study are not publicly available due to institutional restrictions but are available from the corresponding author on reasonable request.