DREAM: Deep Learning-Based Recognition of Emotions from Multiple Affective Modalities Using Consumer-Grade Body Sensors and Video Cameras

Sharma, Aditi and Kumar, Akshi. 2024. DREAM: Deep Learning-Based Recognition of Emotions from Multiple Affective Modalities Using Consumer-Grade Body Sensors and Video Cameras. IEEE Transactions on Consumer Electronics, 70(1), pp. 1434-1442. ISSN 0098-3063 [Article]

Ambient smart cities sit at the intersection of digital technology, disruptive innovation and urban environments, and increasingly call for affect-aware, empathetic interfaces for human-computer interaction (HCI). This research puts forward a deep learning approach, DREAM, for recognizing emotions from three affective modalities (audio, video, physiological) to build an empathetic HCI system using consumer electronic IoT sensors and cameras. A convolutional network is trained on the physiological signals, while pre-trained VGG and ResNet models are used for emotion recognition from the video and audio signals. DREAM is then fine-tuned on the publicly available K-EmoCon dataset to recognize emotions for each subject; K-EmoCon is annotated by seven raters for five discrete emotions and two affect dimensions. Finally, a probability-based average decision-level fusion strategy combines the outputs of all the modalities. A leave-one-subject-out strategy is used to train and evaluate the model for subject-specific accuracies. DREAM achieves a highest accuracy of 81.7% for discrete emotions and 82.4% for dimensional emotions, outperforming the existing state of the art for both emotion models.
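The probability-based average decision-level fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: each modality's classifier is assumed to emit a probability distribution over the emotion classes, and the fused prediction is the arg-max of the per-class average. The class labels and probability values are illustrative placeholders.

```python
import numpy as np

# Five discrete emotion labels (assumed here for illustration; the
# paper's exact label set comes from the K-EmoCon annotations).
EMOTIONS = ["cheerful", "happy", "angry", "nervous", "sad"]

def fuse_predictions(prob_audio, prob_video, prob_physio):
    """Average the class probabilities from the three modalities and
    return the winning label plus the fused distribution."""
    probs = np.stack([prob_audio, prob_video, prob_physio])  # (3, n_classes)
    avg = probs.mean(axis=0)
    return EMOTIONS[int(np.argmax(avg))], avg

# Toy example: video is confident about "happy", audio leans "cheerful".
audio  = np.array([0.40, 0.30, 0.10, 0.10, 0.10])
video  = np.array([0.10, 0.70, 0.05, 0.05, 0.10])
physio = np.array([0.20, 0.40, 0.10, 0.10, 0.20])
label, avg = fuse_predictions(audio, video, physio)
# label -> "happy" (largest per-class average)
```

Averaging at the decision level keeps each modality's network independent, so a modality can be dropped or retrained without touching the others.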
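The leave-one-subject-out evaluation mentioned above can likewise be sketched in a few lines. This is a generic illustration of the protocol, assuming only a per-sample subject-ID array; the actual training and evaluation steps of DREAM are not reproduced here.

```python
import numpy as np

def leave_one_subject_out(subject_ids):
    """Yield (train_idx, test_idx) index arrays, one split per subject:
    each subject's samples are held out once while the model is trained
    on all remaining subjects."""
    subject_ids = np.asarray(subject_ids)
    for subject in np.unique(subject_ids):
        test_mask = subject_ids == subject
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Toy sample-to-subject mapping: five samples from three subjects.
subjects = [1, 1, 2, 2, 3]
splits = list(leave_one_subject_out(subjects))
# 3 subjects -> 3 splits; the first split holds out subject 1's samples.
```

Evaluating per held-out subject is what yields the subject-specific accuracies the paper reports, since emotion expression varies strongly between individuals.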


PDF (Accepted Version): DREAM_Deep_Learning-Based_Recognition_of_Emotions_from_Multiple_Affective_Modalities_Using_Consumer-Grade_Body_Sensors_and_Video_Cameras.pdf

