Abstract: The event-related potential (ERP) technique was used to analyze the early and late characteristics of facial expression processing and to investigate the mechanisms underlying structural encoding and feature encoding. Three kinds of cartoon facial expressions (neutral, happy, angry) and three kinds of non-facial cartoons were presented in random order, and 14 volunteers were instructed to perform face recognition and facial expression recognition tasks. Electroencephalography (EEG) was recorded with a Neuroscan 64-channel cap. We analyzed the early component N170 and the late positive component (LPC) evoked by faces, non-faces, and faces with different emotional expressions. The results showed a significant difference between the N170 evoked by facial cartoons and that evoked by non-facial cartoons: the N170 amplitudes at electrode P7 were (-9.15±1.47) μV for facial cartoons and (-6.91±1.21) μV for non-facial cartoons. There were no significant differences among the N170 responses evoked by the different emotional faces; the N170 was not modulated by facial emotional content. Expression effects appeared 350-550 ms after stimulus onset, where the LPC differed significantly among the emotional faces: the LPC amplitudes at CZ were (6.11±1.79) μV, (7.49±1.31) μV, and (9.89±1.77) μV for neutral, happy, and angry expressions, respectively. These results suggest that the early and late processing mechanisms of face recognition are different: the early component N170 reflects the structural encoding of the face, that is, the characterization of the whole face contour, whereas emotional processing is mainly reflected in the late component, the LPC, which reflects the mechanism of facial feature encoding.
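The component measurements described above (N170 at P7, LPC at CZ) follow the standard ERP procedure of averaging single-trial epochs, baseline-correcting against the pre-stimulus interval, and taking the mean amplitude inside a component-specific time window. A minimal sketch of that procedure, using synthetic data rather than the authors' recordings (the sampling rate, epoch layout, and window bounds here are assumptions for illustration):

```python
import numpy as np

FS = 500          # sampling rate in Hz (assumed for this sketch)
BASELINE_S = 0.2  # pre-stimulus baseline duration per epoch, seconds


def erp_mean_amplitude(epochs, window, fs=FS, baseline_s=BASELINE_S):
    """Average single-trial epochs from one channel (e.g. P7 or CZ),
    subtract the pre-stimulus baseline mean, and return the mean
    amplitude (in the epochs' units) inside a post-stimulus window.

    epochs : array of shape (n_trials, n_samples)
    window : (start_s, end_s) relative to stimulus onset,
             e.g. (0.15, 0.20) for N170 or (0.35, 0.55) for the LPC
    """
    erp = epochs.mean(axis=0)           # grand-average waveform
    n_base = int(baseline_s * fs)
    erp = erp - erp[:n_base].mean()     # baseline correction
    i0 = n_base + int(window[0] * fs)
    i1 = n_base + int(window[1] * fs)
    return erp[i0:i1].mean()


# Toy data: 40 noisy trials containing a negative deflection near
# 170 ms post-stimulus, mimicking an N170-like response.
rng = np.random.default_rng(0)
t = np.arange(int(1.0 * FS)) / FS - BASELINE_S   # time axis, seconds
n170_shape = -9.0 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))
trials = n170_shape + rng.normal(0.0, 2.0, (40, t.size))

amp = erp_mean_amplitude(trials, (0.15, 0.20))
print(f"mean N170-window amplitude: {amp:.2f} uV")
```

Averaging across trials suppresses the zero-mean noise (here by a factor of about 1/sqrt(40)), which is why the component emerges clearly from single trials that are individually dominated by noise.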