Research Article | Open Access
Zhongyang He, Kai Yang, Ning Zhuang, Ying Zeng, "Processing of Affective Pictures: A Study Based on Functional Connectivity Network in the Cerebral Cortex", Computational Intelligence and Neuroscience, vol. 2021, Article ID 5582666, 11 pages, 2021. https://doi.org/10.1155/2021/5582666
Processing of Affective Pictures: A Study Based on Functional Connectivity Network in the Cerebral Cortex
Emotion plays an important role in people's lives. However, existing research has not reached a unified conclusion on the brain functional network under different emotional states. In this study, pictures of different valences from the International Affective Picture System (IAPS) were presented to subjects at a fixed flicker frequency to induce steady-state visual evoked potentials (SSVEPs). Using a source localization method, the cortical source signals were reconstructed from the EEG signals, and the differences in SSVEP amplitude in key brain areas and in the brain functional network connections among brain areas under different emotional states were then analysed in cortical space. The results show that positive and negative emotions evoked greater activation intensities in the prefrontal, temporal, and parietal lobes than neutral emotion did. The network connections that differ significantly between emotional states mainly appear in the alpha and gamma bands, and the connections that differ significantly between positive and negative emotion mainly involve the right middle temporal gyrus and the superior frontal gyrus of both hemispheres. In addition, long-range connections play an important role in emotional processing, especially the connections among the frontal gyri, middle temporal gyrus, and middle occipital gyrus. These results provide a reliable scientific basis for elucidating the neural mechanism of emotion processing and for selecting brain regions and frequency bands in EEG-based emotion recognition.
Emotions are states that combine the physical and mental activities each of us produces. Changes in emotional state affect our work and life: positive emotional states help improve our happiness, while negative emotions can reduce our work efficiency and even cause serious consequences; for example, the negative emotions of fighter pilots may cause plane crashes. Therefore, studying emotional states and emotional processing, and understanding the neural basis behind them, is of great significance for understanding emotions and for effectively identifying and regulating them. In recent years, with the development of brain imaging technology, a growing set of tools has provided scientists with a means of directly observing brain activity in emotional states, as well as effective measures for revealing the mechanisms of emotion. The methods currently used to measure emotional brain activity mainly include fMRI, functional near-infrared imaging, magnetoencephalography (MEG), and EEG. Among them, EEG has become the mainstream method in emotional brain activity research owing to its high temporal resolution, portability, and low price.
The human brain is one of the most complex systems in nature, and functional differentiation and functional integration are the two organizational principles of human brain function. Scientists have discovered that different areas of the brain dominate different functions, yet even a very simple task requires the coordination of multiple brain areas. Emotional activities involve high-level cognitive functions such as memory, cognition, and thinking, so their execution is a complex task for the brain: completing emotional activities requires the interaction and coordination of multiple brain regions, which form a complex brain network to process emotions. Accordingly, more and more researchers have begun to study emotional activities from the perspective of brain networks. Costa et al. used emotional EEG data to construct a phase-synchronization brain network. They found that phase synchronization across the whole brain generally increases in negative emotional states, while in positive emotional states phase synchronization occurs between the frontal and occipital lobes. In 2017, Zhang et al. used Granger causality to construct a brain network from positive and negative emotional EEG signals. They found that the interaction between the prefrontal, parietal, and occipital regions was greater in the negative emotional state than in the positive state and that the parietal area responsible for the human alert mechanism becomes more active in a negative state. In 2018, Wang et al. used functional magnetic resonance imaging to study brain functional connectivity in schizophrenia patients during facial emotion processing; the results showed that, in severe schizophrenia patients, the functional connectivity between the amygdala, the medial prefrontal cortex, and the dorsal anterior cingulate cortex decreased under fear and happiness conditions. Also in 2018, Wyczesany et al. constructed a brain network from EEG signals induced by positive and negative emotional face pictures. The results showed strong connections between the frontal lobe, the attention network (prefrontal lobe and intraparietal sulcus), the temporal lobe, and the occipital lobe in emotional states, and these connections exist mainly in the right hemisphere.
In addition, existing studies have shown that frequency-band rhythm characteristics are closely related to emotional activity, and many researchers have characterized network connections in different frequency bands. In 2019, Li et al. explored networks with significant differences between positive, neutral, and negative emotions in four frequency bands. They found that differential network connections exist mainly in the beta and gamma bands: in the beta band they mainly involve the frontal, parietal, and occipital lobes, and in the gamma band they mainly lie between the temporal lobe and occipital lobe. In 2020, Zheng and Lu studied the network characteristics of various emotional states in five frequency bands. They found that the strong positive network connections in the disgust state appeared mainly in the gamma band, while those in the fear state appeared mainly in the theta band. These studies show that analysing network connections in different frequency bands plays an important role in revealing the neural mechanisms of emotion.
The development of source imaging technology enables the neural activity recorded by scalp EEG sensors to be traced back to a high-resolution cortical space, thereby realizing real-time imaging of the whole-brain neural network. Wheelock et al. found that, under unpredictable threat, the dorsomedial PFC is the neural hub that influences the activity of other brain regions, whereas, when the threat is predictable, the dorsolateral PFC takes this role. Ramirez-Mahaluf et al. found that the left dorsolateral prefrontal cortex and the left medial frontal pole are the regulatory regions for the interaction between sadness and cognitive networks. Teckentrup et al. found that, during emotional image processing, the anterior insular cortex can recruit different prefrontal networks according to expectations. Keuper et al. studied the cortical response underlying emotional word processing and found that, within the P1 time window, the emotional effect peaked in the left middle temporal gyrus. Becker et al. applied source localization techniques to reconstruct EEG activity on the cortical surface to analyse valence (positive or negative emotion), and they found that source reconstruction can improve classification results. Differences in cortical-space brain network connections and in cortical source responses are of great significance for better understanding the processing of different emotions and for improving the accuracy of EEG-based emotion recognition. In this work, exploiting the good temporal resolution (time-locked and phase-locked characteristics) and high signal-to-noise ratio of the steady-state visual evoked potential (SSVEP), we used a source localization method to reconstruct the cortical signals.
Then, using source response analysis, SSVEP amplitude analysis, and phase-locking value (PLV) functional network connection analysis in cortical space, the neural differences in the processing of different types of emotional pictures are studied from local and global perspectives. This research may provide new evidence for revealing and clarifying the neural mechanism of emotion processing.
2. Materials and Methods
2.1. The Data Preprocessing
In this study, we used the emotional SSVEP dataset from our previous work; details about this dataset can be found in our previous study. The dataset consists of 61-channel EEG signals collected from 20 healthy subjects while they watched positive, neutral, negative, and phase-scrambled pictures flickering at a rate of 10 Hz.
The main preprocessing procedures are as follows:
(i) Data extraction: the valid data segment is set to −500 to 2000 ms, i.e., 500 ms before stimulus onset and 2000 ms after.
(ii) Bad channel repair: check whether any damaged channel failed to record EEG data, and replace the data of a damaged channel with the average of its adjacent channels.
(iii) Rereferencing: the data are rereferenced according to the reference electrode standardization technique (REST) proposed by Yao et al.
(iv) Signal filtering: the signal is band-pass filtered between 0.1 and 64 Hz.
(v) Artifact removal: a threshold of ±150 μV is set, and all data segments with amplitude greater than 150 μV are eliminated.
(vi) Baseline correction: baseline correction is performed based on the 500 ms before stimulation.
In addition, it should be noted that, in the artifact removal step, this study does not use ICA-based artifact removal. Instead, following the method in the literature, the threshold is set to 150 μV to remove large-amplitude artifacts while preserving the original EEG data quality.
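As a concrete illustration, the threshold-based artifact rejection, band-pass filtering, and baseline correction described above can be sketched in Python with NumPy and SciPy. The function names and the epoch layout (trials × channels × samples) are our own assumptions for illustration, not the authors' actual pipeline:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 512  # sampling rate (Hz), as stated in Section 2.2

def bandpass(data, low=0.1, high=64.0, fs=FS, order=4):
    """Zero-phase 0.1-64 Hz band-pass filter applied along the time axis."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, data, axis=-1)

def reject_artifacts(epochs, threshold_uv=150.0):
    """Drop any epoch whose absolute amplitude exceeds the +/-150 uV threshold."""
    keep = np.max(np.abs(epochs), axis=(1, 2)) <= threshold_uv
    return epochs[keep]

def baseline_correct(epochs, fs=FS, baseline_ms=500):
    """Subtract the per-channel mean of the 500 ms pre-stimulus window."""
    n_base = int(fs * baseline_ms / 1000)
    baseline = epochs[:, :, :n_base].mean(axis=-1, keepdims=True)
    return epochs - baseline
```

For an epoch window of −500 to 2000 ms at 512 Hz, each epoch has 1280 samples, of which the first 256 form the baseline window.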
2.2. Cortical Source Response Analysis
To study the neural differences in the processing of different types of emotional pictures under the conditions of high temporal resolution and high spatial resolution, we use the source localization method to reconstruct the cerebral cortex signals. Then in cortical space, we use source response analysis, SSVEP amplitude analysis, and PLV functional network connection analysis to study the emotional processing mechanism. Source response analysis and SSVEP amplitude analysis can finely explore the activation intensity of local brain areas under different emotional states, while functional network connection difference analysis can show the global interaction of different brain areas. Through the combination of the two analysis methods, the neural mechanisms of different emotional processing will be explored from local and global views.
In this paper, source reconstruction is carried out with the Brainstorm software, which is freely available under the GNU General Public License (https://neuroimage.usc.edu/brainstorm). Brainstorm performs source imaging of EEG and MEG signals and provides rich time-frequency analysis, connectivity analysis, and other functions. When using Brainstorm for EEG source imaging in this study, the anatomical structure of the brain uses the standard ICBM152 brain template, the forward model uses the three-layer head model (brain, skull, and scalp) provided by OpenMEEG, which is based on the boundary element method (BEM), and the inverse model uses the minimum norm estimation (MNE) algorithm for source imaging.
Source imaging was performed on the EEG signals recorded while each subject viewed positive, neutral, negative, and random-phase pictures to reconstruct the cortical source signals. The source response intensities of all subjects were then averaged for further analysis; the specific analysis steps are as follows.
Suppose that, for subject s under emotional condition c, the EEG signal of the i-th trial is X(s, c, i) ∈ ℝ^(61×1024), where 61 is the number of EEG channels and 1024 is the number of discrete sampling points per channel (sampling rate 512 Hz; signal length 2 s). Here s is the subject index, s = 1, …, 20; c denotes one of the four emotional conditions: happy, neutral, negative, and random phase; i is the trial index, i = 1, …, N(s, c); and N(s, c) is the number of trials remaining after preprocessing. Because the number of remaining trials differs across subjects and conditions, no fixed value is given and N(s, c) is used instead.
Analysis for each participant is as follows:
(1) Source imaging was performed on X(s, c, i) to obtain the source signal S(s, c, i). In the source imaging operation, the entire cortex is divided into 15,000 dipoles, so each trial is mapped to a signal on the cerebral cortex with dimension 15,000 × 1024.
(2) The source imaging results of all trials were averaged to obtain the average source response of subject s under emotional condition c:
S̄(s, c) = (1/N(s, c)) Σ_{i=1}^{N(s,c)} S(s, c, i).
(3) Z-score standardization was applied to the average source response S̄(s, c) to obtain Z(s, c).
(4) The absolute value |Z(s, c)| was calculated.
(5) For each subject s, the source response of neutral emotion was selected as the reference, and it was subtracted from the source responses of the positive, negative, and random-phase conditions:
D(s, c) = |Z(s, c)| − |Z(s, neutral)|, c ∈ {happy, negative, random phase}.
Group analysis is as follows:
(6) The source response differences of all subjects were averaged to obtain D̄(c):
D̄(c) = (1/20) Σ_{s=1}^{20} D(s, c),
where s is the participant index, s = 1, …, 20.
D̄(c) reflects the difference in magnitude between the source response under condition c and that under the neutral condition. When D̄(c) > 0, the average source response under condition c is greater than that under neutral emotion, that is, the response to emotion c is stronger; when D̄(c) < 0, the average source response under condition c is smaller than that under neutral emotion, that is, the response under the neutral condition is stronger.
With neutral emotion as the reference, the differences between the source responses to positive/negative pictures and neutral pictures represent the influence of emotional factors, while the differences between the source responses to random-phase pictures and neutral pictures represent the influence of semantic factors.
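The per-subject and group-level steps above can be sketched as follows, under the assumption that the reconstructed source signals are available as NumPy arrays of shape (trials, dipoles, samples); the dictionary layout, the reduced array sizes, and the z-score over time per dipole are illustrative choices, not the authors' exact implementation:

```python
import numpy as np

def zscore(x, axis=-1):
    """Standardize along the given axis (here: over time, per dipole)."""
    return (x - x.mean(axis=axis, keepdims=True)) / x.std(axis=axis, keepdims=True)

def source_response_difference(src, neutral_key="neutral"):
    """
    src: dict mapping condition -> list (one entry per subject) of arrays
         shaped (n_trials, n_dipoles, n_samples) of reconstructed source signals.
    Returns dict condition -> (n_dipoles, n_samples) group-average difference
    from the neutral condition (positive values: stronger than neutral).
    """
    # Steps (2)-(4): trial average, z-score, absolute value, per subject.
    per_subj = {
        cond: [np.abs(zscore(trials.mean(axis=0))) for trials in subj_list]
        for cond, subj_list in src.items()
    }
    # Steps (5)-(6): subtract the neutral reference, then average over subjects.
    diffs = {}
    for cond in src:
        if cond == neutral_key:
            continue
        d = [a - b for a, b in zip(per_subj[cond], per_subj[neutral_key])]
        diffs[cond] = np.mean(d, axis=0)
    return diffs
```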
2.3. Analysis of SSVEP Amplitude in Cortical Space
Analysing the SSVEP amplitude in cortical space can reveal the response strength of different brain areas at the reference frequency of 10 Hz and mark the key brain areas involved in emotional processing. We selected 10 regions of interest (ROIs) in cortical space for SSVEP amplitude analysis. The selection of these ROIs is based on published research on emotion. Riedel et al. analysed brain activation maps from 1747 experiments and obtained five emotion-related meta-analytic groups covering the whole brain space, including the primary auditory cortex; the insula, anterior cingulate gyrus, and subcortical regions; the medial prefrontal lobe and posterior cingulate cortex; and the amygdala and fusiform gyrus. We therefore selected 10 ROIs located in the cortical PFC, temporal lobe, parietal lobe, and occipital lobe; these cover the emotion-related areas of the cerebral cortex reported by Riedel et al., and each ROI contains 260 cortical source signals.
Table 1 shows the coordinates of the centre positions of these ROIs in the Montreal Neurological Institute (MNI) space. The centre positions are symmetrical about the left and right hemispheres. In the table, LO, LP, LT, LMF, and LSF represent the middle occipital gyrus, anterior parietal gyrus, middle temporal gyrus, middle frontal gyrus, and superior frontal gyrus of the left hemisphere, and RO, RP, RT, RMF, and RSF indicate the middle occipital gyrus, anterior parietal gyrus, middle temporal gyrus, middle frontal gyrus, and superior frontal gyrus of the right hemisphere, respectively.
The specific analysis steps include the analysis of a single subject and the group analysis of multiple subjects. MOG, APG, MTG, MFG, and SFG represent the middle occipital gyrus, anterior parietal gyrus, middle temporal gyrus, middle frontal gyrus, and superior frontal gyrus, respectively.
The first is the analysis for each subject:
(1) Signal extraction: for subject s under emotional condition c, extract the cortical signal Y(s, c, r, i) of the i-th trial in the r-th ROI, where s = 1, …, 20 is the subject index; c denotes one of the four emotional conditions: happy, neutral, negative, and random phase; r = 1, …, 10 is the ROI index; i = 1, …, N(s, c) is the trial index, with N(s, c) the number of trials remaining after preprocessing; each ROI contains M = 260 signal sources; and each source signal has 1024 sampling points (sampling rate 512 Hz; signal length 2 s).
(2) Amplitude calculation: a Fourier transform is applied to each source signal, and the amplitude of the signal at 10 Hz is calculated.
(3) Averaging within the ROI: the 10 Hz amplitudes of the M = 260 sources in the ROI are averaged to obtain the average amplitude A(s, c, r, i) of the i-th trial in the r-th ROI of subject s under condition c.
(4) Averaging across trials: the amplitudes of the N(s, c) trials are averaged to obtain the average amplitude of the r-th ROI of subject s under condition c:
A(s, c, r) = (1/N(s, c)) Σ_{i=1}^{N(s,c)} A(s, c, r, i).
(5) Normalization: taking the SSVEP amplitude of each ROI under the neutral condition as a reference, the amplitude of each ROI under emotional condition c is divided by the corresponding neutral amplitude:
Ã(s, c, r) = A(s, c, r) / A(s, neutral, r).
(6) Group averaging: the normalized results of all subjects are averaged to obtain the normalized amplitude of the r-th ROI under emotional condition c:
Ā(c, r) = (1/20) Σ_{s=1}^{20} Ã(s, c, r),
where s is the participant index, s = 1, …, 20.
In the above procedure, normalization is performed before group averaging to account for possible individual differences in amplitude magnitude between subjects; therefore, each subject's results are normalized before the group-level analysis.
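A minimal sketch of this amplitude pipeline, assuming the ROI source signals are stored as (trials, sources, samples) arrays; the FFT-bin lookup and the averaging order follow the steps above, but the helper names are hypothetical:

```python
import numpy as np

FS = 512  # sampling rate (Hz)

def ssvep_amplitude(signal, target_hz=10.0, fs=FS):
    """Amplitude of a signal at the target frequency via the FFT (last axis)."""
    n = signal.shape[-1]
    spectrum = np.abs(np.fft.rfft(signal, axis=-1)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - target_hz))
    return spectrum[..., idx]

def roi_amplitude(trials):
    """trials: (n_trials, n_sources, n_samples) -> mean 10 Hz amplitude."""
    amps = ssvep_amplitude(trials)          # (n_trials, n_sources)
    return amps.mean(axis=1).mean(axis=0)   # average sources, then trials

def normalized_group_amplitude(cond_trials, neutral_trials):
    """Per-subject normalization by the neutral amplitude, then group mean.

    cond_trials, neutral_trials: lists (one array per subject) of
    (n_trials, n_sources, n_samples) ROI signals for one ROI.
    """
    ratios = [roi_amplitude(c) / roi_amplitude(n)
              for c, n in zip(cond_trials, neutral_trials)]
    return float(np.mean(ratios))
```

With a 2 s window at 512 Hz, the FFT resolution is 0.5 Hz, so the 10 Hz stimulation frequency falls exactly on a bin.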
2.4. Analysis of Cortical Space Function Connection Network
In neuroscience research, synchronization between signals is a key signature of information exchange between different regions, and phase synchronization is a commonly used method for constructing undirected networks. The PLV, proposed by Lachaux et al., measures the instantaneous phase difference between two signals within a narrow frequency band and quantifies the degree to which two neural signals are phase-locked in a specific frequency band and time window. The larger the PLV, the stronger the synchronous coupling of the two signals. For two time series x(t) and y(t), the PLV is computed as [23, 24]:
PLV = (1/N) | Σ_{n=1}^{N} exp(jΔφ(nT)) |,
where T is the sampling period, N is the number of sampling points, φ_x(nT) and φ_y(nT) are the instantaneous phases of x and y, respectively, and Δφ(nT) = φ_x(nT) − φ_y(nT) is their phase difference.
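A minimal Python sketch of this computation, assuming the instantaneous phase is extracted with the Hilbert transform (a common choice that the text does not explicitly mandate) and using the five standard EEG bands defined below:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 512  # sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 64)}

def band_filter(x, band, fs=FS, order=4):
    """Zero-phase band-pass filter into one of the five EEG bands."""
    low, high = BANDS[band]
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=-1)

def plv(x, y):
    """Phase-locking value between two 1-D narrow-band signals."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

def plv_matrix(roi_signals, band):
    """roi_signals: (n_rois, n_samples) -> symmetric PLV connection matrix."""
    filt = band_filter(roi_signals, band)
    n = filt.shape[0]
    mat = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            mat[i, j] = mat[j, i] = plv(filt[i], filt[j])
    return mat
```

Two signals with a constant phase offset yield a PLV near 1, while independent signals yield a value near 0 over long windows.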
According to the division of commonly used frequency bands and rhythms of EEG, the PLV network was calculated in five frequency bands: delta (1–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), beta (12–30 Hz), and gamma (30–64 Hz). The PLV network differences between ROIs under different emotional states are studied by statistical methods. The analysis steps include intrasubject analysis and cross-subject group analysis.
Under emotional condition c, the cortical source signal of the i-th trial of subject s is S(s, c, i), where s = 1, …, 20 is the subject index; c denotes one of the four emotional conditions: happy, neutral, negative, and random phase; i = 1, …, N(s, c) is the trial index, with N(s, c) the number of trials remaining after preprocessing; and the dimension of S(s, c, i) is 15,000 × 1024.
Analysis for each subject is as follows:
(1) Signal extraction: for subject s under emotional condition c, extract the cortical signal Y(s, c, r, i) of the i-th trial in the r-th ROI (r = 1, …, 10), where each ROI contains M = 260 signal sources and each source signal has 1024 sampling points.
(2) Principal component analysis: Brainstorm offers two options, "Average" and "PCA." To retain the useful components to the greatest extent, we use "PCA" (principal component analysis) to extract the largest principal component from Y(s, c, r, i) as the representative signal of the ROI [25, 26]. For subject s under condition c, the largest principal components of the 10 ROIs in the i-th trial form a matrix P(s, c, i) of dimension 10 × 1024. The PLV connection relationships between pairs of ROIs are then analysed on P(s, c, i).
(3) Calculation of the PLV connection matrices: the PLV connection matrix W(s, c, i) is calculated from P(s, c, i); W(s, c, i) is a 10 × 10 network connection matrix in which each value represents the PLV between two ROIs.
(4) Averaging across trials: the connection matrices of the N(s, c) trials are averaged to obtain the average connection matrix W(s, c) of subject s under condition c.
Group analysis of multiple subjects is as follows:
(5) Statistical analysis: a paired t-test is performed on the average PLV connection matrices of all subjects under different emotional conditions to determine whether the connections between ROIs differ significantly across conditions [6, 27].
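The PCA reduction of each ROI and the group-level paired t-test can be sketched as follows; the SVD-based first principal component and the uncorrected p < 0.05 threshold are illustrative assumptions rather than Brainstorm's exact implementation:

```python
import numpy as np
from scipy.stats import ttest_rel

def first_pc(roi_signals):
    """Time course of the largest principal component of an ROI.

    roi_signals: (n_sources, n_samples) array of cortical source signals.
    """
    x = roi_signals - roi_signals.mean(axis=-1, keepdims=True)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return s[0] * vt[0]

def significant_connections(plv_a, plv_b, alpha=0.05):
    """Paired t-test on subject-average PLV matrices of two conditions.

    plv_a, plv_b: (n_subjects, n_rois, n_rois) arrays.
    Returns a boolean mask of ROI pairs whose connection strength differs
    significantly between conditions (uncorrected threshold).
    """
    _, p = ttest_rel(plv_a, plv_b, axis=0)
    mask = p < alpha
    np.fill_diagonal(mask, False)  # self-connections are not tested
    return mask
```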
3. Results
3.1. Differences of Cortical Source Response
We carried out an analysis of the difference between the source response intensities of the four cases, that is, positive-neutral, negative-neutral, positive-negative, and random-neutral.
Figure 1 shows the differences in response intensity between positive and neutral emotion from 0 s to 2 s, plotted as the amplitude of positive emotion minus that of neutral emotion. As the colour bar indicates, yellow and red mean that the amplitude of positive emotion is greater than that of neutral emotion, and blue means that it is smaller. The figure shows that, within 0.5 s to 1 s, the activation of neutral emotion in the left parietal lobe is greater than that of positive emotion, whereas, within 1.5 s to 2 s, the activation of positive emotion is greater than that of neutral emotion in the left frontal lobe, right temporal lobe, and bilateral parietal lobes.
Figure 2 shows the difference in response intensity between negative and neutral emotion from 0 s to 2 s, plotted as the amplitude of negative emotion minus that of neutral emotion. The figure shows that, at 0.5 s, the activation of neutral emotion in the left and right prefrontal lobes is greater than that of negative emotion. From 1 s onward, the activation of negative emotion gradually increases in the left hemisphere, exceeding that of neutral emotion in the temporal lobe (1 s), parietal lobe (1.5 s), and prefrontal lobe (2 s); in the right hemisphere, the response to negative emotion also increases from 1 s, with the temporal lobe response greater than that of neutral emotion.
Figure 3 shows the difference in response intensity between negative and positive emotion from 0 s to 2 s, plotted as the amplitude of negative emotion minus that of positive emotion. The figure shows that, from 0.5 s to 1 s, the activation of negative emotion in the frontal lobes on both sides is lower than that of positive emotion. From 1 s onward, the intensities of negative emotion in the temporal lobe (1 s), parietal lobe (1.5 s), and prefrontal lobe (2 s) exceed those of positive emotion, and the intensity of negative emotion in the right temporal lobe is enhanced.
Figure 4 shows the difference in response intensity between the random-phase picture condition and neutral emotion from 0 s to 2 s, plotted as the amplitude of the random-phase condition minus that of neutral emotion. The figure shows that, from 0.5 s to 1 s, the response intensities of neutral emotion in the occipital, prefrontal, and temporal lobes on both sides are greater than those of the random-phase condition, whereas, from 1.5 s onward, the response intensities in the left and right parietal lobes gradually increase and exceed those of neutral emotion.
3.2. Difference in SSVEP Amplitude
Taking the SSVEP amplitude of neutral emotion as a reference, the amplitude of each ROI under positive emotion, negative emotion, and the random-phase condition was divided by the amplitude of the same ROI under the neutral condition. Table 2 shows the resulting normalized amplitudes of the 10 ROIs on the cortex. The table shows that (1) the amplitude of positive emotion is higher on the left frontal lobe than on the right, while the amplitude of negative emotion is higher on the right frontal lobe than on the left; (2) negative emotion has the largest amplitude in the occipital, temporal, and middle frontal regions on both sides, and the activation intensity of the ventral pathway is the largest; and (3) except for the anterior parietal gyrus of the right hemisphere, the SSVEP amplitudes of the random-phase condition in all other brain areas are smaller than those of neutral emotion.
3.3. Difference in Functional Network Connection
Table 3 shows the PLV connections with significant differences between positive emotion and neutral emotion. There are more differential connections in the alpha and gamma bands. The PLV connections show that key brain areas are RP, RT, RMF, and LMF. Figure 5 shows the PLV connections with a significant difference between positive emotion and neutral emotion in the gamma band, and the PLV connection values under positive emotion are smaller than those of neutral emotion.
Table 4 shows the PLV connections with a significant difference between negative emotion and neutral emotion. The PLV connections with a significant difference mainly exist in alpha and gamma bands. The PLV connections show that the key brain areas are RMF, LSF, RSF, and RO. Figure 6 shows the PLV connections with a significant difference between negative and neutral emotions in the gamma band.
Table 5 shows the PLV connections with significant differences between positive and negative emotion. These connections mainly exist in the alpha, beta, and gamma bands, with the most significantly different connections in the alpha band. The alpha (8–12 Hz) rhythm reflects a relaxed state; as mental load increases, alpha-band energy decreases, most obviously in the temporal and occipital lobes, with frontal energy changes appearing under complex tasks. The PLV connections show that the key brain areas are RO, LMF, and RSF. The PLV connections with significant differences between negative and positive emotion in the alpha band are shown in Figure 7.
Table 6 shows the PLV connections with significant differences between the random-phase condition and neutral emotion in each frequency band. These connections mainly exist in the delta and gamma bands. The key brain areas are LO, RT, RSF, and LT. Figure 8 shows the PLV connections with significant differences between the random-phase condition and neutral emotion in the delta band.
4. Discussion
Based on EEG source localization, this study explores the dynamic response of the cerebral cortex, the amplitude differences between emotions at the 10 Hz stimulation frequency, and the emotion-related brain functional network connection patterns during the processing of affective pictures. The neural mechanism of emotion was analysed from both the responses of local brain regions and the differences in network connections between multiple brain regions.
4.1. The Difference in the Source Response of Different Emotions
Through the analysis of the source response differences between emotions, we found that the response intensity of neutral emotion is greater than that of positive and negative emotions within 1 s after stimulus onset. After 1 s, the activation intensities of positive and negative emotions exceed those of neutral emotion in the temporal, parietal, and prefrontal areas. A comparison of the source responses of positive and negative emotions shows that the response intensities of positive emotion in the frontal lobes on both sides are greater than those of negative emotion, while after 1 s the response intensities of negative emotion exceed those of positive emotion in the temporal, parietal, and prefrontal lobes. Thus, the difference in how the brain processes different emotional stimuli lies mainly in the prefrontal area. According to previous studies, the prefrontal lobe is related to high-level cognitive functions, and many studies have found that it plays an important role in emotional processing. Zhuang et al. found that the channels that matter most in emotion recognition come mainly from the prefrontal, temporal, and occipital regions. Lin et al. studied the behavioural and neural effects of in-group and out-group members on individual emotional processing; using fMRI, they observed that the ventral striatum, ventromedial prefrontal cortex, dorsal prefrontal cortex, medial prefrontal cortex, posterior superior temporal sulcus, and other brain regions are activated in relation to rewards and positive valence. Perry et al. studied the impact of prefrontal cortex damage on emotional understanding and found that the accuracy and response time of emotional understanding in patients with prefrontal damage are worse than in healthy controls. Their research once again confirms that the prefrontal cortex plays an important role in understanding emotion.
4.2. The Analysis of Functional Network Connection
In the functional network connection analysis, we found that the network connections with significant differences between emotional states mainly exist in the alpha and gamma bands. The alpha band is related to the relaxed state, and as the cognitive load of the brain increases in emotional states, the alpha band shows corresponding energy changes. Arndt et al. used EEG signals to evaluate the effect of the quality of a speech-to-text system on users' emotions and found that, when subjects faced lower-quality synthetic text, neuronal activity in the alpha band over the left frontal lobe increased. Schubring et al., in contrast, found that when subjects processed high-arousal emotional stimuli, alpha-band energy decreased. In short, alpha-band energy changes are closely related to the processing of emotional stimuli. Many previous studies have pointed out that high-frequency EEG plays an important role in high-level cognitive activities such as memory, decision-making, and emotion. Li et al. studied the differences in network connections between emotions across frequency bands and found the largest number of significantly different connections in the gamma band. Matsumoto et al. found that, in the 400–450 ms time window after a negative emotional stimulus, the EEG signal shows higher gamma-band power and phase synchronization, and they concluded that gamma-band activity is related to emotional processing. In addition, in the analysis of network connections between emotional states in the alpha and gamma bands, we observed many long-range connections among multiple brain regions. According to previous studies, long-range connections are related to the functional integration of the brain under complex cognitive tasks such as emotion and behaviour control.
Moreover, long-range connections mainly appear in the superior frontal gyrus, middle frontal gyrus, middle temporal gyrus, superior parietal gyrus, and middle occipital gyrus. The connections among these brain regions contribute to emotional processing and information integration.
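As an illustration of how such phase-based connectivity can be quantified, the sketch below computes the phase-locking value (PLV) of Lachaux et al. between two band-limited signals using the Hilbert transform. The sampling rate, band edges, and test signals here are hypothetical, not taken from this study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band):
    """Phase-locking value between two signals within a frequency band.

    Band-pass filters both signals, extracts instantaneous phase via the
    Hilbert transform, and averages the unit phasors of the phase difference.
    PLV is near 1 for consistent phase locking and near 0 for independence.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
    phase_diff = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

fs = 250                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
sig_a = np.sin(2 * np.pi * 10 * t)         # 10 Hz (alpha-band) rhythm
sig_b = np.sin(2 * np.pi * 10 * t + 0.5)   # same rhythm with a constant phase lag
print(plv(sig_a, sig_b, fs, (8, 13)))      # close to 1: strong phase locking
```

In practice, the same measure would be applied between reconstructed source time series of each pair of cortical regions, band by band.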
4.3. The Difference in Brain Response between Neutral and Phase-Random Stimuli
The results of the difference in source response between the neutral and phase-random stimuli showed that, between 0.5 s and 1 s, the activation of the left and right occipital, prefrontal, and temporal lobes under neutral emotion was greater than under the phase-random condition. Song et al. studied the text-picture integration process and found that in the early period of picture processing a larger negative N1 was evoked in the occipital region and a negative N300 was evoked in the prefrontal area, which may reflect the identification of visual stimuli and the image representation of the picture. After 1.5 s of stimulation, the EEG source response intensities induced by phase-random stimulation in the anterior parietal lobe of the left and right hemispheres gradually strengthened and exceeded those of neutral emotion. A comparison of the brain network connections of neutral emotion and the phase-random condition showed that the differences mainly appeared in the bilateral middle temporal gyrus, right superior frontal gyrus, and left superior parietal gyrus in the delta and gamma bands. The key difference between neutral and phase-random pictures is that neutral pictures carry semantic information: processing neutral pictures involves semantic processing, whereas processing phase-random pictures does not. Shan et al. found that the brain areas relevant to Chinese semantic processing are the left inferior frontal gyrus, the left posterior inferior temporal gyrus, the association area of the inferior parietal lobe, and the superior temporal gyrus.
In the study by Song et al., a late positive component was induced in the central, parietal, and temporal regions, which they believed might be related to semantic activation and the integration of text and pictures, respectively.
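Between-condition differences in connection strength of the kind discussed above are typically screened edge by edge with paired t-tests across subjects (with correction for multiple comparisons). The sketch below illustrates this on simulated connectivity values; the subject count, edge count, and effect size are hypothetical.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects, n_edges = 16, 45   # hypothetical: 16 subjects, all edges of a 10-node network

# Simulated per-subject connection strengths (e.g. PLVs) for two conditions,
# with one edge (index 0) genuinely stronger in condition B.
cond_a = rng.normal(0.5, 0.1, (n_subjects, n_edges))
cond_b = rng.normal(0.5, 0.1, (n_subjects, n_edges))
cond_b[:, 0] += 0.3

# Paired t-test per edge across subjects, Bonferroni-corrected threshold.
t_vals, p_vals = ttest_rel(cond_b, cond_a, axis=0)
significant = np.flatnonzero(p_vals < 0.05 / n_edges)
print(significant)
```

The edges surviving the corrected threshold are the ones reported as connections with significant differences between conditions.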
5. Conclusions
Based on SSVEP-induced emotional EEG signals, this paper uses an EEG source localization method to explore the cognitive processing of the brain under emotional states from two aspects: the local brain-area source response and the global brain functional network connection. The results of the local source response show that positive and negative emotions produce greater activation intensity in the temporal, parietal, and prefrontal lobes than neutral emotion. In the early stage of picture processing, the higher activation intensities of the left and right occipital, prefrontal, and temporal lobes are related to semantic information processing. The network connections with significant differences between positive and negative emotion mainly exist in the right middle temporal gyrus and the superior frontal gyrus on both sides. Long-range connections play an important role in emotional processing, especially the connections among the frontal gyrus, middle temporal gyrus, and middle occipital gyrus. In addition, both the local source response and the network connections in the alpha and gamma frequency bands can better discriminate different emotions. This work expands the research on emotional picture processing and provides new support for related studies.
Data Availability
The data are available upon request from the corresponding author, email@example.com.
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
- T. Z. Jiang, Y. Liu, and Y.-H. Li, “Brain networks: from anatomy to dynamics,” Chinese Bulletin of Life Sciences, vol. 21, no. 2, pp. 181–188, 2009.
- T. Costa, E. Rognoni, and D. Galati, “EEG phase synchronization during emotional response to positive and negative film stimuli,” Neuroscience Letters, vol. 406, no. 3, pp. 159–164, 2006.
- J. Zhang, S. Zhao, W. Huang, and S. Hu, “Brain effective connectivity analysis from EEG for positive and negative emotion,” in Proceedings of the International Conference on Neural Information Processing, pp. 851–857, Guangzhou, China, November 2017.
- Y. Wang, Z. Li, and W. Liu, “Negative schizotypy and altered functional connectivity during facial emotion processing,” Schizophrenia Bulletin, vol. 44, no. 2, pp. 1–10, 2018.
- M. Wyczesany, P. Capotasto, F. Zappasodi, and G. Prete, “Hemispheric asymmetries and emotions: evidence from effective connectivity,” Neuropsychologia, vol. 121, pp. 98–105, 2018.
- P. Li, H. Liu, Y. Si et al., “EEG based emotion recognition by combining functional connectivity network and local activations,” IEEE Transactions on Biomedical Engineering, vol. 66, no. 10, pp. 2869–2881, 2019.
- W. L. Zheng and B. L. Lu, “Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks,” IEEE Transactions on Autonomous Mental Development, vol. 7, no. 3, pp. 162–175, 2015.
- A. C. Evans, D. L. Collins, S. R. Mills, E. D. Brown, and T. M. Peters, “3D statistical neuroanatomical models from 305 MRI volumes,” in Proceedings of the 1993 IEEE Conference Record Nuclear Science Symposium and Medical Imaging Conference, pp. 1813–1817, San Francisco, CA, USA, November 1993.
- M. D. Wheelock, K. R. Sreenivasan, K. H. Wood, L. W. Ver Hoef, G. Deshpande, and D. C. Knight, “Threat-related learning relies on distinct dorsal prefrontal cortex network connectivity,” Neuroimage, vol. 102, pp. 904–912, 2014.
- J. P. Ramirez-Mahaluf, J. Perramon, B. Otal, P. Villoslada, and A. Compte, “Author Correction: subgenual anterior cingulate cortex controls sadness-induced modulations of cognitive and emotional network hubs,” Scientific Reports, vol. 8, no. 1, Article ID 11237, 2018.
- V. Teckentrup, J. N. v. d. Meer, V. Borchardt et al., “The anterior insula channels prefrontal expectancy signals during affective processing,” Neuroimage, vol. 200, pp. 414–424, 2019.
- K. Keuper, P. Zwanzger, M. Nordt et al., “How ‘love’ and ‘hate’ differ from ‘sleep’: using combined electro/magnetoencephalographic data to reveal the sources of early cortical responses to emotional words,” Human Brain Mapping, vol. 35, no. 3, pp. 875–888, 2014.
- H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources,” IEEE Transactions on Affective Computing, vol. 11, no. 2, pp. 244–257, 2020.
- M. Zhu, E. Alonso-Prieto, T. Handy, and J. Barton, “The brain frequency tuning function for facial emotion discrimination: an SSVEP study,” Journal of Vision, vol. 16, no. 6, pp. 12–14, 2016.
- K. Yang, Y. Zeng, L. Tong, B. Liu, and B. Yan, “Study on temporal and spatial patterns of brain in emotional state based on steady state visual evoked potentials,” Journal of Physics: Conference Series, vol. 1187, no. 4, 2019.
- D. Yao, Y. Qin, S. Hu, L. Dong, M. L. Bringas Vega, and P. A. Valdés Sosa, “Which reference should we use for EEG and ERP practice?” Brain Topography, vol. 32, no. 4, pp. 530–549, 2019.
- F. Tadel, S. Baillet, J. C. Mosher, D. Pantazis, and R. M. Leahy, “Brainstorm: a user-friendly application for MEG/EEG analysis,” Computational Intelligence and Neuroscience, vol. 2011, Article ID 879716, 13 pages, 2011.
- A. Gramfort, T. Papadopoulo, E. Olivi, and M. Clerc, “OpenMEEG: opensource software for quasistatic bioelectromagnetics,” Biomedical Engineering Online, vol. 9, no. 1, Article ID 45, 2010.
- J. Kybic, M. Clerc, T. Abboud, O. Faugeras, R. Keriven, and T. Papadopoulo, “A common formalism for the Integral formulations of the forward EEG problem,” IEEE Transactions on Medical Imaging, vol. 24, no. 1, pp. 12–28, 2005.
- S. Baillet, J. C. Mosher, and R. M. Leahy, “Electromagnetic brain mapping,” IEEE Signal Processing Magazine, vol. 18, no. 6, pp. 14–30, 2001.
- M. C. Riedel, J. A. Yanes, K. L. Ray et al., “Dissociable meta‐analytic brain networks contribute to coordinated emotional processing,” Human Brain Mapping, vol. 39, no. 6, pp. 2514–2531, 2018.
- V. Sakkalis, “Review of advanced techniques for the estimation of brain connectivity measured with EEG/MEG,” Computers in Biology and Medicine, vol. 41, no. 12, pp. 1110–1117, 2011.
- J.-P. Lachaux, E. Rodriguez, J. Martinerie, and F. J. Varela, “Measuring phase synchrony in brain signals,” Human Brain Mapping, vol. 8, no. 4, pp. 194–208, 1999.
- J. M. Hurtado, L. L. Rubchinsky, and K. A. Sigvardt, “Statistical method for detection of phase-locking episodes in neural oscillations,” Journal of Neurophysiology, vol. 91, no. 4, pp. 1883–1898, 2004.
- R. Sanz-Requena, J. M. Prats-Montalbán, L. Martí-Bonmatí et al., “Automatic individual arterial input functions calculated from PCA outperform manual and population‐averaged approaches for the pharmacokinetic modeling of DCE‐MR images,” Journal of Magnetic Resonance Imaging, vol. 42, no. 2, pp. 477–487, 2014.
- J. Shlens, “A tutorial on principal component analysis,” 2014, arXiv:1404.1100.
- P. Diehr, D. C. Martin, T. Koepsell, and A. Cheadle, “Breaking the matches in a paired t-test for community interventions when the number of pairs is small,” Statistics in Medicine, vol. 14, no. 13, pp. 1491–1504, 1995.
- N. Zhuang, Y. Zeng, K. Yang, C. Zhang, L. Tong, and B. Yan, “Investigating patterns for self-induced emotion recognition from EEG signals,” Sensors, vol. 18, no. 3, pp. 841–862, 2018.
- L. C. Lin, Y. Qu, and E. H. Telzer, “Intergroup social influence on emotion processing in the brain,” Proceedings of the National Academy of Sciences of the United States of America, vol. 115, no. 42, pp. 10630–10635, 2018.
- A. Perry, S. N. Saunders, J. Stiso et al., “Effects of prefrontal cortex damage on emotion understanding: EEG and behavioural evidence,” Brain, vol. 140, no. 4, pp. 1086–1099, 2017.
- S. Arndt, J.-N. Antons, R. Gupta et al., “The effects of text-to-speech system quality on emotional states and frontal alpha band power,” in Proceedings of the 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego, CA, USA, November 2013.
- D. Schubring and H. T. Schupp, “Emotion and brain oscillations: high arousal is associated with decreases in alpha- and lower beta-band power,” Cerebral Cortex, vol. 31, no. 3, pp. 1597–1608, 2020.
- A. Matsumoto, Y. Ichikawa, N. Kanayama, H. Ohira, and T. Iidaka, “Gamma band activity and its synchronization reflect the dysfunctional emotional processing in alexithymic persons,” Psychophysiology, vol. 43, no. 6, pp. 533–540, 2006.
- S. Li, S. Chen, H. Zhang, Q. Zhao, and J. Hong, “Dynamic cognitive processes of text-picture integration revealed by event-related potentials,” Brain Research, vol. 1726, Article ID 146513, 2019.
- B. Shan, “Chinese semantic processing cerebral areas,” Chinese Science Bulletin, vol. 48, no. 23, pp. 2607–2610, 2003.
Copyright © 2021 Zhongyang He et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.