885 Hits in 5.7 sec

The Psychological and Physiological Part of Emotions: Multimodal Approximation for Valence Classification [article]

Jennifer Sorinas, Jose Manuel Ferrández, Eduardo Fernandez
2019 arXiv   pre-print
After feature selection of each methodology, the results of the classification showed that, although the classification of emotions is possible at both central and peripheral levels, the multimodal approach  ...  of valence.  ...  Organization (ONCE) and the Seneca Foundation – Agency of Science and Technology of the Region of Murcia.  ... 
arXiv:1905.00231v1 fatcat:5zpru6hcizcsfn2l725dvzmmdy

The Psychological and Physiological Part of Emotions: Multimodal Approximation for Valence Classification [article]

Jennifer Sorinas Nerin, Jose Manuel Ferrández, Eduardo Fernandez
2019 bioRxiv   pre-print
After feature selection of each methodology, the results of the classification showed that, although the classification of emotions is possible at both central and peripheral levels, the multimodal approach  ...  of valence.  ...  Organization (ONCE) and the Seneca Foundation – Agency of Science and Technology of the Region of Murcia.  ... 
doi:10.1101/638239 fatcat:o3sgqhsaezbgnfydp7c3mfiax4

Brain and Body Emotional Responses: Multimodal Approximation for Valence Classification

Jennifer Sorinas, Jose Manuel Ferrández, Eduardo Fernandez
2020 Sensors  
After feature selection of each methodology, the results of the classification showed that, although the classification of emotions is possible at both central and peripheral levels, the multimodal approach  ...  of valence.  ...  Multimodal Approximation In previous work [29], we proposed an EEG-based model for the classification of positive and negative emotions.  ... 
doi:10.3390/s20010313 pmid:31935909 fatcat:63czsq5jora7xcn2y4kgce3ztq

Automatic, Dimensional and Continuous Emotion Recognition

Hatice Gunes, Maja Pantic
2010 International Journal of Synthetic Emotions  
The design of emotion-specific classification schemes that can handle multimodal and spontaneous data is one of the most important issues in the field.  ...  However, new forms of non-contact physiological sensing might facilitate better utilisation of psychological signals as input to multimodal affect recognition systems.  ... 
doi:10.4018/jse.2010101605 fatcat:hipfyafiybfl5fk2ag6gvflm24

AVEC 2016 - Depression, Mood, and Emotion Recognition Workshop and Challenge [article]

Michel Valstar, Jonathan Gratch, Björn Schuller, Fabien Ringeval, Denis Lalanne, Mercedes Torres Torres, Stefan Scherer, Giota Stratou, Roddy Cowie, Maja Pantic
2016 arXiv   pre-print
for automatic audio, visual and physiological depression and emotion analysis, with all participants competing under strictly the same conditions.  ...  The goal of the Challenge is to provide a common benchmark test set for multi-modal information processing and to bring together the depression and emotion recognition communities, as well as the audio  ...  Acknowledgements The research leading to these results has received funding from the EC's 7th Framework Programme through the ERC Starting  ... 
arXiv:1605.01600v4 fatcat:j5bbsbjijzbgxh5zpclfksr4vu

Exploring the contextual factors affecting multimodal emotion recognition in videos [article]

Prasanta Bhattacharya, Raj Kumar Gupta, Yinping Yang
2021 arXiv   pre-print
This study analyzes the interplay between the effects of multimodal emotion features derived from facial expressions, tone and text in conjunction with two key contextual factors: i) gender of the speaker, and ii) duration of the emotional episode.  ...  We also found that for neutral and positively valenced emotions, like happiness, the improvements in multimodal classification over unimodal classification were higher, as reflected by a higher MM1 score  ... 
arXiv:2004.13274v5 fatcat:nb4tst75czbqrji2sbaqijjjqa

Modeling emotion in complex stories: the Stanford Emotional Narratives Dataset [article]

Desmond C. Ong, Zhengxuan Wu, Tan Zhi-Xuan, Marianne Reddan, Isabella Kahhale, Alison Mattek, Jamil Zaki
2019 arXiv   pre-print
We then introduce the first version of the Stanford Emotional Narratives Dataset (SENDv1): a set of rich, multimodal videos of self-paced, unscripted emotional narratives, annotated for emotional valence  ...  We demonstrate several baseline and state-of-the-art modeling approaches on the SEND, including a Long Short-Term Memory model and a multimodal Variational Recurrent Neural Network, which perform comparably  ...  ACKNOWLEDGMENTS The authors would like to thank Emma Master, Kira Alqueza, Michael Smith, and Erika Weisz for assistance with the project, and Noah Goodman for discussions about modeling.  ... 
arXiv:1912.05008v1 fatcat:g5tmrhdluvf57nru6mkqrevfjy
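The SEND abstract above mentions a Long Short-Term Memory baseline that maps a multimodal feature sequence to a continuous valence track. As a rough illustration of that kind of setup, here is a minimal PyTorch sketch; the feature dimensionality, sequence length, and loss are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of an LSTM valence regressor for time-series emotion data,
# in the spirit of the SENDv1 baselines above. All shapes are assumptions.
import torch
import torch.nn as nn

class ValenceLSTM(nn.Module):
    def __init__(self, n_features=40, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, time, n_features)
        h, _ = self.lstm(x)               # h: (batch, time, hidden)
        return self.head(h).squeeze(-1)   # one valence value per timestep

model = ValenceLSTM()
x = torch.randn(8, 120, 40)               # 8 clips, 120 timesteps, 40 features
valence = model(x)                         # (8, 120)
loss = nn.MSELoss()(valence, torch.zeros_like(valence))  # dummy targets
loss.backward()
```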

Fusion Of Musical Contents, Brain Activity And Short Term Physiological Signals For Music-Emotion Recognition

Jimmy Jarjoura, Sergio Giraldo, Rafael Ramirez
2017 Zenodo  
Additionally, we evaluated the contribution of each audio feature in the classification performance of the multimodal system.  ...  In this study we propose a multi-modal machine learning approach, combining EEG and Audio features for music emotion recognition using a categorical model of emotions.  ...  In the first part, the results of the EEG mood classification will be discussed. In the second part, the multimodal classification will be assessed.  ... 
doi:10.5281/zenodo.1095499 fatcat:neoiqqhvsjf7fkofgoi4dwaqmy

A Systematic Review on Affective Computing: Emotion Models, Databases, and Recent Advances [article]

Yan Wang, Wei Song, Wei Tao, Antonio Liotta, Dawei Yang, Xinlei Li, Shuyong Gao, Yixuan Sun, Weifeng Ge, Wei Zhang, Wenqiang Zhang
2022 arXiv   pre-print
Thus, the fusion of physical information and physiological signals can provide useful features of emotional states and lead to higher accuracy.  ...  Next, we survey and taxonomize state-of-the-art unimodal affect recognition and multimodal affective analysis in terms of their detailed architectures and performances.  ...  The network achieves a classification accuracy of 92.87% and 92.30% for arousal and valence, respectively. Decision-level fusion.  ... 
arXiv:2203.06935v3 fatcat:h4t3omkzjvcejn2kpvxns7n2qe

ECG Pattern Analysis for Emotion Detection

F. Agrafioti, D. Hatzinakos, A. K. Anderson
2012 IEEE Transactions on Affective Computing  
Two experimental setups are presented for the elicitation of active arousal and passive arousal/valence.  ...  This work brings to the table the ECG signal and presents a thorough analysis of its psychological properties.  ...  A multimodal framework was evaluated based on the combination of facial, verbal, gesture, and physiological recordings (HR included).  ... 
doi:10.1109/t-affc.2011.28 fatcat:cyadwtlwqfe6xf6u5yrpuzwa6m

Advances in Multimodal Emotion Recognition Based on Brain–Computer Interfaces

Zhipeng He, Zina Li, Fuzhou Yang, Lei Wang, Jingcong Li, Chengju Zhou, Jiahui Pan
2020 Brain Sciences  
This paper primarily discusses the progress of research into multimodal emotion recognition based on BCI and reviews three types of multimodal affective BCI (aBCI): aBCI based on a combination of behavior  ...  Finally, we identify several important issues and research directions for multimodal emotion recognition based on BCI.  ...  They applied this method separately for the online experiment and achieved accuracies of approximately 68.00% for the valence space and 70.00% for the arousal space after fusion – both of which surpassed  ... 
doi:10.3390/brainsci10100687 pmid:33003397 pmcid:PMC7600724 fatcat:juzx77asgrh2zpl3s2jvw6tdcq

MUSER: MUltimodal Stress Detection using Emotion Recognition as an Auxiliary Task [article]

Yiqun Yao, Michalis Papakostas, Mihai Burzo, Mohamed Abouelenien, Rada Mihalcea
2021 arXiv   pre-print
Evaluations on the Multimodal Stressed Emotion (MuSE) dataset show that our model is effective for stress detection with both internal and external auxiliary tasks, and achieves state-of-the-art results  ...  Although a series of methods have been established for multimodal stress detection, limited steps have been taken to explore the underlying inter-dependence between stress and emotion.  ...  Acknowledgements We would like to thank Cristian-Paul Bara and Mimansa Jaiswal for their helpful discussion on the data processing and features of MuSE dataset.  ... 
arXiv:2105.08146v1 fatcat:3tzug2j5endgdew6wl5iwok5u4
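MUSER's central idea, using emotion recognition as an auxiliary task for stress detection, amounts in its simplest form to a shared encoder with two heads and a weighted joint loss. The sketch below shows that general pattern; the layer sizes, label sets, and the 0.5 auxiliary weight are assumptions for illustration, not the paper's architecture.

```python
# Sketch of multi-task training with an auxiliary emotion head alongside a
# primary stress head, as in the MUSER entry above. Hyperparameters are assumed.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, n_features=64, hidden=32, n_emotions=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.stress_head = nn.Linear(hidden, 2)            # primary task
        self.emotion_head = nn.Linear(hidden, n_emotions)  # auxiliary task

    def forward(self, x):
        z = self.encoder(x)                # shared representation
        return self.stress_head(z), self.emotion_head(z)

model = MultiTaskNet()
x = torch.randn(16, 64)                    # synthetic multimodal features
stress_logits, emo_logits = model(x)
y_stress = torch.randint(0, 2, (16,))
y_emo = torch.randint(0, 4, (16,))
# Joint objective: primary stress loss plus a down-weighted auxiliary loss.
loss = nn.CrossEntropyLoss()(stress_logits, y_stress) \
     + 0.5 * nn.CrossEntropyLoss()(emo_logits, y_emo)
loss.backward()
```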

Investigating EEG-Based Functional Connectivity Patterns for Multimodal Emotion Recognition [article]

Xun Wu, Wei-Long Zheng, Bao-Liang Lu
2020 arXiv   pre-print
The classification accuracies of multimodal emotion recognition are 95.08/6.42% on the SEED dataset, 84.51/5.11% on the SEED-V dataset, and 85.34/2.90% and 86.61/3.76% for arousal and valence on the DEAP  ...  In addition, we find that the brain networks constructed with 18 channels achieve comparable performance with that of the 62-channel network in multimodal emotion recognition and enable easier setups for  ...  Multimodal Frameworks As a complex psychological state, emotion is reflected in both physical behaviors and physiological activities [52] [19] .  ... 
arXiv:2004.01973v1 fatcat:g6ozlunmgrdnln4fpoda36d7da

Multimodal Emotion Recognition in Response to Videos

M. Soleymani, M. Pantic, T. Pun
2012 IEEE Transactions on Affective Computing  
The best classification accuracy of 68.5% for three labels of valence and 76.4% for three labels of arousal were obtained using a modality fusion strategy and a support vector machine.  ...  One of the three affective labels of either valence or arousal was determined by classification of bodily responses.  ...  The DLF superior classification rate for arousal and its similar performance for valence classification shows that the proposed emotion classification can replace the self-reporting of single participants  ... 
doi:10.1109/t-affc.2011.37 fatcat:2x5jjpvzwnenfdh3klexyxm7em
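The entry above reports its best accuracies using a modality fusion strategy with a support vector machine. One common way to realize such fusion at the decision level is to train one SVM per modality and average their class posteriors; the sketch below shows that pattern on synthetic data. The feature shapes and the averaging rule are assumptions, not the paper's exact pipeline.

```python
# Minimal sketch of decision-level modality fusion with per-modality SVMs.
# Data is synthetic; only the fusion pattern itself is illustrated.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 300
y = rng.integers(0, 3, size=n)                       # three valence labels
eeg = rng.normal(size=(n, 32)) + y[:, None] * 0.5    # hypothetical EEG features
gaze = rng.normal(size=(n, 8)) + y[:, None] * 0.3    # hypothetical peripheral features

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

clf_eeg = SVC(probability=True).fit(eeg[idx_tr], y[idx_tr])
clf_gaze = SVC(probability=True).fit(gaze[idx_tr], y[idx_tr])

# Fuse at the decision level by averaging per-class posteriors.
proba = (clf_eeg.predict_proba(eeg[idx_te]) +
         clf_gaze.predict_proba(gaze[idx_te])) / 2.0
print("fused accuracy:", accuracy_score(y[idx_te], proba.argmax(axis=1)))
```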

EEG-based human emotion recognition using entropy as a feature extraction measure

Pragati Patel, Raghunandan R, Ramesh Naidu Annavarapu
2021 Brain Informatics  
This review aims to give a brief summary of various entropy-based methods used for emotion classification hence providing insights into EEG-based emotion recognition.  ...  Many studies on brain-computer interface (BCI) have sought to understand the emotional state of the user to provide a reliable link between humans and machines.  ...  Acknowledgements The authors acknowledge Pondicherry University for financial support through University Fellowship.  ... 
doi:10.1186/s40708-021-00141-5 pmid:34609639 fatcat:c5qogfsptbhwxjh4otnzdsng6m
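The review above surveys entropy measures as EEG features for emotion classification. Two of the simplest, Shannon entropy of the amplitude histogram and differential entropy under a Gaussian assumption, can be sketched in a few lines; the window length and bin count below are illustrative choices, not recommendations from the review.

```python
# Sketch of two entropy features commonly extracted from EEG windows in the
# emotion-recognition literature surveyed above. The signal is synthetic.
import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of the amplitude histogram of one window."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins before log
    return -np.sum(p * np.log2(p))

def differential_entropy(x):
    """Differential entropy (nats) under a Gaussian assumption:
    0.5 * log(2 * pi * e * var(x))."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

rng = np.random.default_rng(0)
window = rng.normal(size=256)             # one 1-second window at 256 Hz
print("H_shannon =", shannon_entropy(window))
print("H_diff    =", differential_entropy(window))
```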
Showing results 1 — 15 out of 885 results