Feature Fusion Based Deep Residual Networks Using Deep and Shallow Learning for EEG-Based Emotion Recognition

Zhou Rushuang1,2,3&, Zhao Huilin1,2,3&, Lin Weiyue1,2,3&, Hu Wanrou1,2,3, Zhang Li1,2,3, Huang Gan1,2,3, Li Linling1,2,3, Zhang Zhiguo1,2,3, Liang Zhen1,2,3*

1(School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen 518071, Guangdong, China)
2(National Regional Key Technology Engineering Laboratory for Medical Ultrasound, Shenzhen 518071, Guangdong, China)
3(Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen 518071, Guangdong, China)
|
|
Abstract Electroencephalography (EEG) offers portability, high temporal resolution, and real-time operation, and has therefore been used to recognize, monitor, and track human emotions in fields such as healthcare, entertainment, and education. However, owing to the non-stationarity of EEG signals and individual differences between subjects, it is difficult to extract informative emotion-related characteristics effectively and efficiently with traditional methods. To obtain representative features efficiently and to improve emotion classification accuracy, we proposed a feature-fusion-based deep residual network using deep and shallow learning for EEG-based emotion recognition. The proposed model consisted of three modules: a shallow feature extraction module, a deep feature extraction module, and a classification module. First, the shallow feature extraction module was designed with multiple convolution layers to extract shallow temporal-spatial features. Second, the deep feature extraction module employed a Bi-GRU layer with an attention mechanism to extract deep temporal-spatial features from the shallow features. Third, the classification module used a fully connected layer for binary classification. Evaluated with subject-based leave-one-out cross-validation on the DEAP database (76 800 samples), the proposed model achieved good emotion recognition performance, with binary classification accuracies of 96.95% for valence and 97.22% for arousal. Compared with existing methods, the proposed model improved the accuracies by 3.53% for valence and 4.25% for arousal. The good binary emotion classification performance of the proposed model was further validated on the MAHNOB-HCI and SEED databases.
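To make the three-module pipeline concrete, the following is a minimal PyTorch sketch of the design described above: convolutional shallow feature extraction, a Bi-GRU layer with attention for deep feature extraction, and a fully connected layer for binary classification. The layer sizes, kernel shapes, attention formulation, and the 32-channel, 128-sample input shape are illustrative assumptions rather than the authors' exact configuration, and residual connections within the convolutional module are omitted for brevity.

# Minimal sketch of the shallow-conv -> Bi-GRU + attention -> FC pipeline.
# All hyperparameters below are illustrative assumptions, not the authors' settings.
import torch
import torch.nn as nn


class EmotionNet(nn.Module):
    def __init__(self, n_channels=32, n_classes=2):
        super().__init__()
        # Shallow feature extraction: temporal then spatial convolutions.
        self.shallow = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 9), padding=(0, 4)),   # temporal filtering
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),         # spatial filtering
            nn.BatchNorm2d(32),
            nn.ELU(),
        )
        # Deep feature extraction: bidirectional GRU plus additive attention over time.
        self.bigru = nn.GRU(32, 64, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(128, 1)
        # Classification: fully connected layer producing binary logits.
        self.fc = nn.Linear(128, n_classes)

    def forward(self, x):                     # x: (batch, 1, channels, time)
        h = self.shallow(x)                   # (batch, 32, 1, time)
        h = h.squeeze(2).permute(0, 2, 1)     # (batch, time, 32)
        h, _ = self.bigru(h)                  # (batch, time, 128)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time steps
        h = (w * h).sum(dim=1)                # (batch, 128) attended summary
        return self.fc(h)                     # (batch, n_classes) logits


if __name__ == "__main__":
    model = EmotionNet()
    dummy = torch.randn(4, 1, 32, 128)        # 4 segments, 32 channels, 128 time points
    print(model(dummy).shape)                 # torch.Size([4, 2])

In the subject-based leave-one-out protocol described in the abstract, a model of this form would be trained on all subjects but one and tested on the held-out subject, repeating the procedure so that every subject serves as the test set once.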
|
Received: 08 May 2021
|
|
|
|
|
|
|
|
|