In this paper, a multi-scale residual framework operating on multiple frequency bands is proposed for EEG-based emotion recognition. Time-frequency information of the EEG signals is captured by computing the Hjorth mobility (HM), one of the Hjorth parameters, across several frequency ranges. In addition, feature matrices based on HM and differential entropy (DE) are constructed to represent spatial-domain information. By combining the DE and HM feature matrices from each of five parallel frequency bands, the proposed multi-scale residual network learns feature maps for each frequency band. A lightweight CNN with a fully connected layer then performs cross-band learning to complete the emotion classification. The performance of the emotion classification strategy is evaluated on the SEED and SEED-IV datasets, where it achieves classification accuracy that outperforms comparable methods. Furthermore, analysis of the normalized HM values shows greater volatility and a larger degree of fluctuation in the gamma and beta frequency bands, which are helpful for the EEG emotion classification task.
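The two per-band features named in the abstract have standard closed forms: Hjorth mobility is the ratio of the standard deviation of a signal's first difference to that of the signal itself, and differential entropy for an approximately Gaussian band-limited signal reduces to a function of its variance. The sketch below, which is an illustration and not the paper's actual pipeline (segment length, sampling rate, and the synthetic data are assumptions), shows how both features could be computed for one EEG segment:

```python
import numpy as np

def hjorth_mobility(x: np.ndarray) -> float:
    # Mobility = sqrt( var(first difference of x) / var(x) )
    return float(np.sqrt(np.var(np.diff(x)) / np.var(x)))

def differential_entropy(x: np.ndarray) -> float:
    # For an approximately Gaussian signal with variance sigma^2:
    # DE = 0.5 * ln(2 * pi * e * sigma^2)
    return float(0.5 * np.log(2.0 * np.pi * np.e * np.var(x)))

# Hypothetical example: one 1-second segment at 200 Hz; in the paper's
# setting this would be a band-pass-filtered EEG channel (delta/theta/
# alpha/beta/gamma), with one HM and one DE value per channel and band
# assembled into the feature matrices.
rng = np.random.default_rng(0)
segment = rng.standard_normal(200)
hm = hjorth_mobility(segment)
de = differential_entropy(segment)
print(f"HM = {hm:.3f}, DE = {de:.3f}")
```

For white noise the first difference roughly doubles the variance, so HM lands near sqrt(2); real band-limited EEG is smoother and yields lower mobility, which is why HM discriminates between bands.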