Multimodal Emotion Classification in Naturalistic User Behavior
Is part of
Human-Computer Interaction. Towards Mobile and Intelligent Interaction Environments, p.603-611
Place / Publisher
Berlin, Heidelberg: Springer Berlin Heidelberg
Link to full text
Source
Alma/SFX Local Collection
Descriptions/Notes
The design of intelligent personalized interactive systems that have knowledge of the user’s state, desires, needs, and wishes currently poses a great challenge to computer scientists. In this study we propose an information fusion approach that combines acoustic and bio-physiological data from multiple sensors to classify emotional states. For this purpose, a multimodal corpus has been created in which subjects undergo a controlled emotion-eliciting experiment, passing through several octants of the valence-arousal-dominance space. The temporal and decision-level fusion of the multiple modalities outperforms the single-modality classifiers and shows promising results.
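The abstract mentions decision-level fusion, where each modality-specific classifier produces its own class probabilities and the outputs are combined afterwards. The paper does not specify the fusion rule used, so the following is only a minimal sketch of one common variant, weighted averaging of per-modality probabilities; the class layout, weights, and probability values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of decision-level fusion: each modality-specific
# classifier (e.g. acoustic, bio-physiological) outputs a probability
# distribution over emotion classes, and the fused decision is the class
# with the highest weighted-average probability. All numbers below are
# illustrative, not from the paper.

def fuse_decisions(modality_probs, weights=None):
    """Combine per-modality class probabilities by weighted averaging.

    modality_probs: list of probability vectors, one per modality.
    weights: optional per-modality weights (default: uniform).
    Returns (winning_class_index, fused_probability_vector).
    """
    n_mod = len(modality_probs)
    n_cls = len(modality_probs[0])
    if weights is None:
        weights = [1.0 / n_mod] * n_mod
    total = sum(weights)
    fused = [
        sum(w * probs[c] for w, probs in zip(weights, modality_probs)) / total
        for c in range(n_cls)
    ]
    return max(range(n_cls), key=fused.__getitem__), fused

# Example: two modalities disagree; the fused score decides.
acoustic = [0.6, 0.3, 0.1]   # acoustic classifier favors class 0
biophys = [0.2, 0.6, 0.2]    # bio-physiological classifier favors class 1
winner, fused = fuse_decisions([acoustic, biophys])
```

Unequal weights can encode per-modality reliability, e.g. down-weighting a noisy sensor channel.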