Temporal analysis of functional brain connectivity for EEG-based emotion recognition

Document Type: Research Article

Authors

Department of Electrical Engineering, Sharif University of Technology, Tehran, Iran

Abstract

EEG signals have attracted particular attention in emotion recognition owing to their high temporal resolution and the rich information they carry about brain activity. Different brain areas work together, and brain activity changes over time. In this study, we investigate emotion classification performance using functional connectivity features in different frequency bands and compare it with the performance of the differential entropy (DE) feature, which has previously been used for this task. Moreover, we investigate the effect of different time periods on classification performance. Our results on the SEED dataset show that emotions become more stable as time goes on, and the classification accuracy increases accordingly. Among the time periods considered, we achieve the highest classification accuracy using the interval from 140 s to the end of each trial, where the accuracy improves by 4 to 6% compared to using the entire signal. In this interval, Pearson correlation coefficient, coherence, and phase locking value features combined with an SVM classifier reach a mean accuracy of about 88%. Within the proposed framework, functional connectivity features therefore yield better classification accuracy than DE features (mean accuracy of 84.89%). Finally, using the best time interval and an SVM, we achieve higher accuracy than recurrent neural networks (RNNs), which require large amounts of data and incur a high computational cost.
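To make the described pipeline concrete, below is a minimal sketch (not the authors' implementation) of computing band-limited connectivity features and classifying them with an SVM. The eeg array of shape (n_channels, n_samples), the sampling rate fs, and the band edges are illustrative assumptions; coherence features can be obtained analogously with scipy.signal.coherence.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.svm import SVC

def bandpass(eeg, low, high, fs, order=4):
    # Zero-phase band-pass filter for one frequency band (e.g. beta: 14-31 Hz).
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

def pearson_connectivity(eeg):
    # Pearson correlation coefficient between every pair of channels.
    return np.corrcoef(eeg)

def plv_connectivity(eeg):
    # Phase locking value from the Hilbert-transform instantaneous phase.
    phase = np.angle(hilbert(eeg, axis=-1))
    n = eeg.shape[0]
    plv = np.ones((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            plv[i, j] = plv[j, i] = np.abs(
                np.mean(np.exp(1j * (phase[i] - phase[j]))))
    return plv

def upper_tri(mat):
    # Vectorize the symmetric connectivity matrix (upper triangle only).
    return mat[np.triu_indices_from(mat, k=1)]

# Hypothetical usage: keep the segment from 140 s to the end of a trial,
# extract one feature vector per band, and train a linear SVM. X_train and
# y_train stand for feature vectors and emotion labels assembled elsewhere.
# seg = eeg[:, int(140 * fs):]
# x = upper_tri(plv_connectivity(bandpass(seg, 14, 31, fs)))
# clf = SVC(kernel="linear").fit(X_train, y_train)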



Volume 32, Issue 1
Transactions on Computer Science & Engineering and Electrical Engineering
January and February 2025, Article ID: 6664
  • Receive Date: 02 April 2022
  • Revise Date: 17 May 2023
  • Accept Date: 21 February 2024