Lu Weikun1, Zhou Yingyue1,2*, Wu Qiao3, Liao Xiang4, Liu Qi1, Huang Runxia1, Yang Bo5
1(School of Information and Control Engineering, Southwest University of Science and Technology, Mianyang 621010, Sichuan, China)
2(School of Medicine, Southwest University of Science and Technology, Mianyang 621010, Sichuan, China)
3(The Third People's Hospital of Mianyang, Mianyang 621010, Sichuan, China)
4(School of Medicine, Chongqing University, Chongqing 400044, China)
5(College of Electronic Engineering, Chengdu University of Information Technology, Chengdu 610225, China)
Abstract: Emotion recognition plays a crucial role in human-computer interaction, and EEG signals have advantages in reflecting human emotional states. Modeling the complex interactions between brain regions based on the spatial topology of EEG channels can provide essential information for feature extraction in emotion recognition. However, graph convolution over a single graph structure defined by spatial topology is restricted to one fixed information-aggregation scheme, making it difficult to describe the complex relationships between EEG channels. To address this issue, we proposed an EEG emotion recognition model based on a graph attention mechanism. The core idea was to use a multi-head graph attention mechanism to dynamically assign weights to node connections, thereby adaptively capturing the relationships between EEG channels and overcoming the shortcomings of existing methods in information aggregation and dynamic pattern capture. The proposed model was validated on 7 424 EEG samples from the DEAP dataset, achieving classification accuracies of 96.06%, 96.54%, and 96.84% on the valence, arousal, and dominance dimensions, respectively. Compared with the ELGCNN model, which is also based on graph neural networks, the proposed model improved accuracy by 6.20% and 6.56% on the valence and arousal dimensions, respectively, demonstrating its effectiveness. The model's performance was further validated on 4 630 EEG samples from the DREAMER dataset, yielding classification accuracies of 87.87%, 83.47%, and 79.96% for the valence, arousal, and dominance dimensions, respectively. The experimental results demonstrate that the graph attention mechanism can effectively extract emotion-related EEG features and improve the accuracy of EEG-based emotion recognition.
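For illustration only, the following is a minimal sketch of the standard multi-head graph attention update in the sense of Veličković et al.'s graph attention networks, written for EEG channels; the notation is assumed rather than taken from this paper: h_i denotes the feature vector of EEG channel i, W^(m) and a^(m) are the learnable projection and attention parameters of head m, N_i is the neighbourhood of channel i in the channel graph, and M is the number of heads.

e_{ij}^{(m)} = \mathrm{LeakyReLU}\!\left(\mathbf{a}^{(m)\top}\left[\mathbf{W}^{(m)}\mathbf{h}_i \,\Vert\, \mathbf{W}^{(m)}\mathbf{h}_j\right]\right)

\alpha_{ij}^{(m)} = \frac{\exp\!\big(e_{ij}^{(m)}\big)}{\sum_{k\in\mathcal{N}_i}\exp\!\big(e_{ik}^{(m)}\big)}

\mathbf{h}_i' = \Big\Vert_{m=1}^{M}\,\sigma\!\Big(\sum_{j\in\mathcal{N}_i}\alpha_{ij}^{(m)}\mathbf{W}^{(m)}\mathbf{h}_j\Big)

Unlike graph convolution over a fixed spatial adjacency matrix, the coefficients α_{ij}^{(m)} are learned per channel pair and per head, which is what allows the aggregation pattern to adapt to the data, as described in the abstract.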