1,785 Hits in 7.2 sec

How Can AI Recognize Pain and Express Empathy [article]

Siqi Cao, Di Fu, Xu Yang, Pablo Barros, Stefan Wermter, Xun Liu, Haiyan Wu
2021 arXiv   pre-print
The current drive for automated pain recognition is motivated by a growing number of healthcare requirements, and demands for social interaction make it increasingly essential. ... This article explores the challenges and opportunities of real-world multimodal pain recognition from a psychological, neuroscientific, and artificial intelligence perspective. ... A recent study has shown that the combination of a CNN and an LSTM successfully detects facial expressions associated with pain in a video sequence. ...
arXiv:2110.04249v1 fatcat:f6gvjacowfafnfybb2mgoip2ni
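
The CNN+LSTM combination mentioned in the snippet above is a common pattern for clip-level pain or expression recognition: a CNN encodes each frame and an LSTM aggregates the per-frame features over time. The sketch below is a minimal, hypothetical PyTorch version of that pattern; the layer sizes, class names, and two-class (pain / no pain) head are illustrative assumptions, not the architecture from the cited study.

```python
# Hypothetical CNN + LSTM sketch: a small CNN encodes each video frame,
# an LSTM aggregates the per-frame features, and a linear head predicts
# pain vs. no-pain. All sizes and names are illustrative.
import torch
import torch.nn as nn

class PainCNNLSTM(nn.Module):
    def __init__(self, feat_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(            # per-frame feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, frames):               # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)        # last hidden state summarises the clip
        return self.head(h_n[-1])

logits = PainCNNLSTM()(torch.randn(2, 16, 3, 64, 64))  # 2 clips of 16 frames each
```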

MULTIMODAL COMPLEX EMOTIONS: GESTURE EXPRESSIVITY AND BLENDED FACIAL EXPRESSIONS

JEAN-CLAUDE MARTIN, RADOSLAW NIEWIADOMSKI, LAURENCE DEVILLERS, STEPHANIE BUISINE, CATHERINE PELACHAUD
2006 International Journal of Humanoid Robotics  
There has been a lot of psychological research on emotion and nonverbal communication in facial expressions, vocal expressions, and expressive body movements. Yet, these psychological studies were ... Section 2 summarizes some of the studies on complex emotions, gesture expressivity, and facial expressions. ... Acknowledgments: This work was partially supported by the Network of Excellence HUMAINE ...
doi:10.1142/s0219843606000825 fatcat:f33srovcbrdxtnid2cqcbaxcse

Combining Facial Expressions and Electroencephalography to Enhance Emotion Recognition

Yongrui Huang, Jianhao Yang, Siyu Liu, Jiahui Pan
2019 Future Internet  
In this paper, we adopted a multimodal emotion recognition framework by combining facial expression and EEG, based on a valence-arousal emotional model. ... Previous studies have investigated the use of facial expression and electroencephalogram (EEG) signals from a single modality for emotion recognition separately, but few have paid attention to a fusion between ... In this study, we adopted a multimodal emotion recognition framework by combining facial expression and EEG. ...
doi:10.3390/fi11050105 fatcat:ngushp2ec5euxge3kdov5jobru
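
The valence-arousal fusion described above is often realised at the decision level: each modality produces its own class probabilities, which are then combined. The snippet below is a minimal sketch of such a weighted fusion, assuming four valence-arousal quadrant classes; the weights, labels, and function names are illustrative and not taken from the paper.

```python
# Minimal decision-level fusion sketch for a valence-arousal setting:
# each modality (facial expression, EEG) yields class probabilities over
# the four valence-arousal quadrants, which are combined by weighted sum.
import numpy as np

QUADRANTS = ["HV-HA", "HV-LA", "LV-HA", "LV-LA"]  # valence/arousal quadrants

def fuse_predictions(p_face, p_eeg, w_face=0.6, w_eeg=0.4):
    """Weighted sum of per-modality probabilities, renormalised."""
    fused = w_face * np.asarray(p_face) + w_eeg * np.asarray(p_eeg)
    return fused / fused.sum()

p_face = [0.55, 0.20, 0.15, 0.10]   # e.g. output of a facial-expression classifier
p_eeg = [0.30, 0.40, 0.20, 0.10]    # e.g. output of an EEG classifier
fused = fuse_predictions(p_face, p_eeg)
print(QUADRANTS[int(np.argmax(fused))], fused.round(3))
```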

Detecting expressions with multimodal transformers [article]

Srinivas Parthasarathy, Shiva Sundaram
2020 arXiv   pre-print
Among other cues such as voice activity and gaze, a person's audio-visual expression, which includes tone of voice and facial expression, serves as an implicit signal of engagement between parties in ... This study investigates deep-learning algorithms for audio-visual detection of a user's expression. ... as well as fluctuations in physiological signals such as heart rate [11, 12]. ...
arXiv:2012.00063v1 fatcat:zbcl4s76zvf4lasf5dz6fcvfrq
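
For the multimodal transformer named in the title above, one common design is to project audio and visual feature sequences into a shared embedding space, tag them with a modality embedding, and run a standard transformer encoder over the joint token sequence. The following is a hypothetical PyTorch sketch of that idea; the feature dimensions, pooling, and layer counts are assumptions rather than the authors' configuration.

```python
# Hypothetical multimodal transformer sketch for audio-visual expression
# detection: per-frame audio and visual embeddings are projected to a
# shared width, given a modality embedding, concatenated along time, and
# encoded by a standard transformer; the pooled output is classified.
import torch
import torch.nn as nn

class AVExpressionTransformer(nn.Module):
    def __init__(self, d_audio=40, d_visual=512, d_model=128, num_classes=2):
        super().__init__()
        self.proj_a = nn.Linear(d_audio, d_model)
        self.proj_v = nn.Linear(d_visual, d_model)
        self.modality = nn.Embedding(2, d_model)          # 0 = audio, 1 = visual
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, audio, visual):   # audio: (B, Ta, 40), visual: (B, Tv, 512)
        a = self.proj_a(audio) + self.modality.weight[0]
        v = self.proj_v(visual) + self.modality.weight[1]
        tokens = torch.cat([a, v], dim=1)                 # joint audio-visual sequence
        encoded = self.encoder(tokens)
        return self.head(encoded.mean(dim=1))             # pooled clip-level logits

logits = AVExpressionTransformer()(torch.randn(2, 50, 40), torch.randn(2, 16, 512))
```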

Non-contact Emotion Recognition Combining Heart Rate and Facial Expression for Interactive Gaming Environments

Guanglong Du, Shuaiying Long, Hua Yuan
2020 IEEE Access  
In this study, we proposed a method to detect a player's emotions based on heart rate (HR) signals and facial expressions (FE). ... Current methods mostly exploit intrusive physiological signals to detect a player's emotions. ... Therefore, combining facial expressions and physiological signals is the best solution in an interactive gaming environment. ...
doi:10.1109/access.2020.2964794 fatcat:7gulu7jal5hcreadu5vovkoo6m
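
A simple way to realise the HR + facial-expression combination described above is feature-level fusion: compute a small feature vector per modality, concatenate the vectors, and train a classifier on the result. The sketch below illustrates this with toy data; the feature definitions, labels, and logistic-regression classifier are assumptions for illustration only.

```python
# Feature-level fusion sketch: heart-rate (HR) statistics are concatenated
# with facial-expression (FE) features and fed to a simple classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def hr_features(bpm_series):
    """Simple statistics over a beats-per-minute time series."""
    bpm = np.asarray(bpm_series, dtype=float)
    return np.array([bpm.mean(), bpm.std(), bpm.max() - bpm.min()])

def fuse(hr_feats, fe_feats):
    """Concatenate modality features into one vector (feature-level fusion)."""
    return np.concatenate([hr_feats, fe_feats])

# Toy training set: the FE features stand in for e.g. action-unit intensities.
X = np.stack([fuse(hr_features(rng.normal(75, 5, 60)), rng.random(8))
              for _ in range(40)])
y = rng.integers(0, 2, size=40)           # 0 = calm, 1 = excited (toy labels)
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(X[:3]))
```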

A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions

Zhihong Zeng, M. Pantic, G.I. Roisman, T.S. Huang
2009 IEEE Transactions on Pattern Analysis and Machine Intelligence  
based on facial expressions, head movements, and body gestures.  ...  This paper introduces and surveys these recent advances. We first discuss human emotion perception from a psychological perspective.  ...  ACKNOWLEDGMENTS The authors would like to thank Qiang Ji and the anonymous reviewers for encouragement and valuable comments. This paper is a collaborative work.  ... 
doi:10.1109/tpami.2008.52 pmid:19029545 fatcat:xntouxtsxze3bn3dstdrv5eibi

Virtual Reality-Based Facial Expressions Understanding for Teenagers with Autism [chapter]

Esubalew Bekele, Zhi Zheng, Amy Swanson, Julie Davidson, Zachary Warren, Nilanjan Sarkar
2013 Lecture Notes in Computer Science  
This work proposed a VR-based facial emotion recognition mechanism in the presence of contextual storytelling. ... Results from a usability study support the idea that individuals with autism may employ different facial processing strategies. ... Individuals with ASD are known to demonstrate impaired recognition and understanding of emotional facial expressions [1]. ...
doi:10.1007/978-3-642-39191-0_50 fatcat:vn7mkemwf5gqlptbsrvjt6bb4e

Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression

Yanjia Sun, Hasan Ayaz, Ali N. Akansu
2020 Brain Sciences  
Human facial expressions are regarded as a vital indicator of one's emotion and intention, and even reveal the state of health and wellbeing.  ...  Experimental results reveal a strong correlation between spontaneous facial affective expressions and the perceived emotional valence.  ...  Acknowledgments: Authors would like to thank Adrian Curtin and Jesse Mark for their help with the paper.  ... 
doi:10.3390/brainsci10020085 pmid:32041316 pmcid:PMC7071625 fatcat:pwtkrpgqvbbvrfzqhigwhzjl7y
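
The reported link between spontaneous facial expressions and perceived emotional valence is typically quantified with a correlation coefficient. The snippet below shows, on synthetic data, how such a Pearson correlation could be computed; the rating scale and score names are illustrative assumptions, not the study's actual measures.

```python
# Correlation-analysis sketch: compare per-trial facial valence scores
# with perceived (self-reported) valence ratings using Pearson's r.
# All data here are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
perceived_valence = rng.uniform(1, 9, size=30)               # e.g. 1-9 self-report ratings
facial_valence = perceived_valence + rng.normal(0, 1.0, 30)  # noisy facial valence score
r, p = pearsonr(facial_valence, perceived_valence)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```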

A Robotic Framework to Facilitate Sensory Experiences for Children with Autism Spectrum Disorder

Hifza Javed, Rachael Burns, Myounghoon Jeon, Ayanna M. Howard, Chung Hyuk Park
2019 ACM Transactions on Human-Robot Interaction (THRI)  
A preliminary user study was conducted to evaluate the efficacy of the proposed framework, with a total of 18 participants (5 with ASD and 13 typically developing) between the ages of 4 and 12 years.  ...  This paper discusses a novel robot-based framework designed to target sensory difficulties faced by children with ASD in a controlled setting.  ...  movement, speech, facial expression, and physiological signal analyses using unobtrusive sensors to estimate the internal state of a child.  ... 
doi:10.1145/3359613 pmid:33829148 pmcid:PMC8023221 fatcat:vpqgeruembbpnbinp6g5ucmej4

Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments

Esubalew Bekele, Zhi Zheng, Amy Swanson, Julie Crittendon, Zachary Warren, Nilanjan Sarkar
2013 IEEE Transactions on Visualization and Computer Graphics  
In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore  ...  A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed.  ...  We would like to thank all colleagues that helped in evaluating the avatars and the expressions. We give special thanks to all subjects and their family.  ... 
doi:10.1109/tvcg.2013.42 pmid:23428456 pmcid:PMC3867269 fatcat:d2ecqnbzfbgdfbhtyxgf7qut7q

Exploiting facial expressions for affective video summarisation

Hideo Joho, Joemon M. Jose, Roberto Valenti, Nicu Sebe
2009 Proceeding of the ACM International Conference on Image and Video Retrieval - CIVR '09  
A facial expression recognition system was deployed to capture a viewer's face and his/her expressions. The user's facial expressions were analysed to infer personalised affective scenes from videos.  ...  This paper presents an approach to affective video summarisation based on the facial expressions (FX) of viewers.  ...  [11] performed a preliminary study of the role of viewer's physiological states in an attempt to improve data indexing for search and within the search process itself.  ... 
doi:10.1145/1646396.1646435 dblp:conf/civr/JohoJVS09 fatcat:u4uoe2fkjfhihgbufewo7d7osa
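
The summarisation idea above can be reduced to a simple rule: score how expressive the viewer's face is over time and keep the video segments where that score stays high. The sketch below implements that thresholding rule on synthetic per-second scores; the threshold, minimum segment length, and scoring are illustrative assumptions, not the paper's exact method.

```python
# Affective summarisation sketch: select (start, end) second ranges where
# a per-second facial-expression score stays above a threshold.
import numpy as np

def affective_segments(scores, threshold=0.6, min_len=3):
    """Return (start, end) indices of sufficiently long high-expression runs."""
    scores = np.asarray(scores)
    segments, start = [], None
    for t, s in enumerate(scores):
        if s >= threshold and start is None:
            start = t                        # run of high scores begins
        elif s < threshold and start is not None:
            if t - start >= min_len:
                segments.append((start, t))  # keep runs longer than min_len
            start = None
    if start is not None and len(scores) - start >= min_len:
        segments.append((start, len(scores)))
    return segments

viewer_scores = np.clip(np.random.default_rng(2).normal(0.4, 0.3, 120), 0, 1)
print(affective_segments(viewer_scores))
```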

Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

Mar Saneiro, Olga C. Santos, Sergio Salmeron-Majadas, Jesus G. Boticario
2014 The Scientific World Journal  
We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion ... In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning ... the members of this project for the enriching discussions carried out, especially Raúl Cabestrero, Pilar Quirós, and Ángeles Manjarrés-Riesco from UNED, and the Kinect-related developments made by Miguel ...
doi:10.1155/2014/484873 pmid:24892055 pmcid:PMC4030504 fatcat:yn4dqxah6bcm5gzkqgp3itp63u

3D Approaches and Challenges in Facial Expression Recognition Algorithms—A Literature Review

Francesca Nonis, Nicole Dagnes, Federica Marcolin, Enrico Vezzetti
2019 Applied Sciences  
Furthermore, we describe in detail the most used databases to address the problem of facial expressions and emotions, highlighting the results obtained by the various authors.  ...  such limitations is adopting a multimodal 2D+3D analysis.  ...  Marginal studies have been conducted on the combination of 3D facial data and physiological or acoustic cues.  ... 
doi:10.3390/app9183904 fatcat:dvbywdwqyvhp7l5ucac6d5bpue

Advances in Emotion Recognition: Link to Depressive Disorder [chapter]

Xiaotong Cheng, Xiaoxia Wang, Tante Ouyang, Zhengzhi Feng
2020 Mental Disorders [Working Title]  
Emotion recognition enables real-time analysis, tagging, and inference of cognitive affective states from human facial expression, speech and tone, body posture and physiological signals, as well as social ... A great number of features can be extracted from internal and external emotional signals by calculating their mean, standard deviation, transforms, wave-band power, and peaks, among others ... Though it started in the 1970s, facial expression recognition is the most studied field in machine recognition of natural emotions, especially in the USA and Japan, wherein studies on facial expression recognition ...
doi:10.5772/intechopen.92019 fatcat:jmss4llbpnfrxcue6bzebsgmby
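
The feature types listed in this snippet (mean, standard deviation, band power, peak counts) are straightforward to compute for a single physiological channel. The example below sketches them for a synthetic EEG-like trace; the sampling rate, band edges, and peak-height threshold are assumed values for illustration.

```python
# Feature-extraction sketch for one physiological channel: mean, standard
# deviation, band power from a Welch spectrum, and a simple peak count.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch, find_peaks

fs = 256                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
trace = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)  # 10 Hz tone + noise

def band_power(x, fs, low, high):
    """Integrate the Welch power spectral density over [low, high] Hz."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    mask = (freqs >= low) & (freqs <= high)
    return trapezoid(psd[mask], freqs[mask])

features = {
    "mean": trace.mean(),
    "std": trace.std(),
    "alpha_band_power": band_power(trace, fs, 8, 13),     # 8-13 Hz alpha band
    "num_peaks": len(find_peaks(trace, height=0.5)[0]),   # simple peak count
}
print(features)
```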

Affective Body Expression Perception and Recognition: A Survey

Andrea Kleinsmith, Nadia Bianchi-Berthouze
2013 IEEE Transactions on Affective Computing  
Thanks to the decreasing cost of whole-body sensing technology and its increasing reliability, there is an increasing interest in, and understanding of, the role played by body expressions as a powerful ... One issue is whether there are universal aspects to affect expression perception and recognition models or if they are affected by human factors such as culture. ... Preliminary studies by Pollick and colleagues [54], [55] examined high-, low-, and neutral-saliency facial expressions combined with motion-captured arm movements representing knocking motions for angry ...
doi:10.1109/t-affc.2012.16 fatcat:g2fbblnnubhj3lgsn62f5mhoxe
Showing results 1 — 15 out of 1,785 results