Multi-target Positive Emotion Recognition from EEG Signals

Guozhen Zhao, Yulin Zhang, Guanhua Zhang, Dan Zhang, Yong-Jin Liu

IEEE Transactions on Affective Computing (TAFFC), pp. 1-13, 2020.


Compared with the widely studied negative emotions, whose classes are relatively easy to distinguish, less attention has been paid to recognizing positive emotions, which are not fully independent of one another. In this paper, we propose to recognize multiple positive emotions by analyzing brain activities, and we explore the neural representation of different positive emotions. Thirty-seven participants volunteered for our study, in which their brain activities were recorded while they watched five selected film clips. First, 150 well-known power features extracted from electroencephalography (EEG) signals and 105 multimedia content analysis features were collected as the pool of candidate features. Second, based on the collected features, we propose a linear model and a nonlinear model to predict the percentages of five positive emotions. The percentage values were then converted to ranking numbers, and Kendall rank correlation coefficients were calculated. Our results show that (1) an ensemble of regressor chains using LSTM as the unit regressor obtained both the best regression results and the best Kendall rank correlation coefficient using EEG features alone, and (2) top features from the alpha frequency band of EEG signals can represent different positive emotions. These results demonstrate the effectiveness of selected EEG features for recognizing different positive emotions.
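The evaluation step described above, converting predicted percentage values to ranking numbers and comparing them with a Kendall rank correlation coefficient, can be sketched as follows. This is an illustrative sketch only; the percentage values and the pure-Python Kendall tau (no ties) are assumptions, not the paper's actual data or implementation.

```python
from itertools import combinations

def to_ranks(percentages):
    # Convert percentage values to ranking numbers (1 = highest percentage),
    # mirroring the percentage-to-ranking conversion described in the abstract.
    order = sorted(range(len(percentages)), key=lambda i: -percentages[i])
    ranks = [0] * len(percentages)
    for rank, idx in enumerate(order, start=1):
        ranks[idx] = rank
    return ranks

def kendall_tau(x, y):
    # Kendall rank correlation for rankings without ties:
    # tau = (concordant - discordant) / (n * (n - 1) / 2)
    pairs = list(combinations(range(len(x)), 2))
    concordant = sum(1 for i, j in pairs if (x[i] - x[j]) * (y[i] - y[j]) > 0)
    discordant = sum(1 for i, j in pairs if (x[i] - x[j]) * (y[i] - y[j]) < 0)
    return (concordant - discordant) / len(pairs)

# Hypothetical predicted vs. self-reported percentages for five positive emotions
predicted = [0.30, 0.10, 0.25, 0.20, 0.15]
reported = [0.35, 0.05, 0.20, 0.25, 0.15]
tau = kendall_tau(to_ranks(predicted), to_ranks(reported))
```

With the hypothetical values above, the two rankings agree on all but one pair of emotions, giving a tau of 0.8.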



@article{zhao20_taffc,
  title   = {Multi-target Positive Emotion Recognition from EEG Signals},
  author  = {Zhao, Guozhen and Zhang, Yulin and Zhang, Guanhua and Zhang, Dan and Liu, Yong-Jin},
  year    = {2020},
  journal = {IEEE Transactions on Affective Computing (TAFFC)},
  doi     = {10.1109/TAFFC.2020.3043135},
  pages   = {1-13}
}