Study on the Differences of Early-Mid ERP Components Induced by Scene Situation and Face Expression Images
Institute of Biomedical Engineering, Chinese Academy of Medical Sciences & Peking Union Medical College, Tianjin 300192, China
Abstract: Research on the cerebral processing mechanisms of emotion has important scientific significance and application value. To probe the different cortical processing mechanisms of emotions induced by scene situation and face expression images, 16 graduate students (7 males, average age 27±3) participated in the experiments. The electroencephalogram was recorded while subjects viewed scene situation and face expression images (each divided into three types: positive, negative, and neutral) and experienced and judged the corresponding emotions. The amplitudes of early-mid occipital ERP (event-related potential) components elicited by the different emotional images were compared, and the RMS (root mean square) across electrodes was calculated to analyze activation of the whole brain. The results show that the N1 (170 ms) amplitudes induced by face images were larger than those induced by scene images, whereas the P2 (250 ms) amplitudes induced by face images were smaller than those induced by scene images, reflecting the specificity of face processing and the reprocessing mechanism of complex scenes. Negative scene images were processed preferentially and elicited more pronounced N1 components than positive and neutral scene images, while emotional images were reprocessed more than neutral images at the moment of P2. In addition, ERP amplitude comparisons across the whole brain showed that the occipital lobe was the main active region, and the frontal lobe, which is responsible for emotional regulation, was activated mainly at the moments of N1 and P2. These analyses of early-mid ERP component amplitudes demonstrate clear differences in the cerebral processing mechanisms of emotions induced by scene situation and face expression images, which deserve further research.
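To make the two measures named in the abstract concrete, the short Python sketch below computes a per-time-point RMS across electrodes (the whole-brain activation index) and mean amplitudes in latency windows around N1 and P2. It is a minimal illustration only: the sampling rate, channel count, epoch layout, placeholder data, and window boundaries (150-190 ms and 230-270 ms) are assumptions for demonstration, not the recording or analysis parameters used in the study.

import numpy as np

fs = 500.0                 # assumed sampling rate in Hz (illustrative, not the study's value)
n_channels, n_samples = 62, 400   # hypothetical montage and epoch length (800 ms at 500 Hz)

rng = np.random.default_rng(0)
erp = rng.standard_normal((n_channels, n_samples))  # placeholder grand-average ERP (channels x samples)
times = np.arange(n_samples) / fs * 1000.0          # time axis in ms; epoch assumed to start at 0 ms

# Whole-brain activation index: RMS over channels at each time point
rms = np.sqrt(np.mean(erp ** 2, axis=0))

def mean_amplitude(data, times, t_start, t_end):
    # Mean amplitude per channel within a latency window given in ms
    mask = (times >= t_start) & (times <= t_end)
    return data[:, mask].mean(axis=1)

# Assumed windows around the reported N1 (~170 ms) and P2 (~250 ms) peaks
n1_amp = mean_amplitude(erp, times, 150, 190)
p2_amp = mean_amplitude(erp, times, 230, 270)

print("peak whole-brain RMS at %.0f ms" % times[np.argmax(rms)])

In practice, the per-channel N1/P2 amplitudes obtained this way would be compared between the face and scene conditions, while the RMS curve indicates when overall scalp activation is strongest.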