Real-time multimodal emotion classification system in E-Learning context
10.1007/978-3-030-80568-5_35
Includes usage data since 2022
Cite as:
hdl:2117/349313
Document type: Conference proceedings paper
Publication date: 2021
Publisher: Springer
Access conditions: Open access
All rights reserved. This work is protected by the corresponding intellectual and industrial property rights. Without prejudice to existing legal exemptions, its reproduction, distribution, public communication, or transformation without the authorization of the rights holder is prohibited.
Abstract
Learners' emotions are crucial in e-learning, as they promote learning. To investigate how emotions can improve and optimize e-learning outcomes, machine learning models have been proposed in the literature. However, the models proposed so far are suited to the offline setting, where the data for emotion classification is stored and can be accessed without restriction. In contrast, when data arrives as a stream, the model sees each instance only once, and a real-time response is required for real-time emotion classification. Additionally, researchers have found that a single data modality cannot capture the full picture of the learning experience and the learner's emotions. Multimodal data streams, such as electroencephalogram (EEG), respiratory belt (RB), and electrodermal activity (EDA) signals, are therefore used to improve accuracy and provide deeper insight into learners' emotions and learning experience. In this paper, we propose a Real-time Multimodal Emotion Classification System (ReMECS) based on a feed-forward neural network trained online with the incremental stochastic gradient descent algorithm. To validate the performance of ReMECS, we use the popular multimodal benchmark emotion classification dataset DEAP. The results (accuracy and F1-score) show that ReMECS classifies emotions adequately in real time from the multimodal data stream, in comparison with state-of-the-art approaches.
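The stream-learning setup the abstract describes (a feed-forward network updated one sample at a time with incremental SGD, evaluated as it goes) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the network size, learning rate, feature dimensionality, and synthetic labels are all assumptions for demonstration, not values from the paper or the DEAP dataset.

```python
import numpy as np

class OnlineFFNN:
    """Tiny one-hidden-layer feed-forward net trained one sample at a
    time (incremental SGD). Illustrative sketch, not the ReMECS code."""

    def __init__(self, n_in, n_hidden=16, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, n_hidden)
        self.b2 = 0.0
        self.lr = lr

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        h = np.tanh(x @ self.W1 + self.b1)      # hidden activations
        return self._sigmoid(h @ self.W2 + self.b2), h

    def partial_fit(self, x, y):
        """One SGD step on a single (x, y) from the stream."""
        p, h = self.forward(x)
        err = p - y                              # d(log-loss)/d(pre-sigmoid)
        dh = err * self.W2 * (1.0 - h ** 2)      # backprop through tanh
        self.W1 -= self.lr * np.outer(x, dh)
        self.b1 -= self.lr * dh
        self.W2 -= self.lr * err * h
        self.b2 -= self.lr * err
        return p                                 # prediction made BEFORE the update

# Simulated multimodal stream: one concatenated feature vector per time
# window (stand-in for EEG/RB/EDA features; dimensionality is made up).
rng = np.random.default_rng(1)
n_features, n_samples = 12, 2000
w_true = rng.normal(size=n_features)
model = OnlineFFNN(n_features)
correct = 0
for _ in range(n_samples):
    x = rng.normal(size=n_features)
    y = float(x @ w_true > 0)                    # synthetic binary label
    p = model.partial_fit(x, y)                  # test-then-train (prequential)
    correct += (p > 0.5) == y
print(f"prequential accuracy: {correct / n_samples:.2f}")
```

Each instance is predicted before it is used for a single gradient step, so the accuracy accumulates prequentially, matching the "see the data once, respond in real time" constraint described above.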
Citation: Nandi, A. [et al.]. Real-time multimodal emotion classification system in E-Learning context. In: International Conference on Engineering Applications of Neural Networks. "Proceedings of the 22nd Engineering Applications of Neural Networks Conference, EANN 2021". Berlin: Springer, 2021, p. 423-435. ISBN 978-3-030-80568-5. DOI 10.1007/978-3-030-80568-5_35.
ISBN: 978-3-030-80568-5
Publisher's version: https://link.springer.com/chapter/10.1007%2F978-3-030-80568-5_35
Files | Description | Size | Format | View
---|---|---|---|---
_PUBLISHED_EANN ... version__Camera_ready_.pdf | | 714.0 KB | PDF | View/Open