EMOPAIN Challenge 2020: Multimodal Pain Evaluation from Facial and Bodily Expressions
[article]
2020
arXiv
pre-print
expressions, pain recognition from multimodal movement, and protective movement behaviour detection. ...
The challenge also aims to encourage the use of the relatively underutilised, albeit vital bodily expression signals for automatic pain and pain-related emotion recognition. ...
Utilising the visual and movement data dimensions, the EmoPain 2020 challenge presents three pain recognition tasks: (i) Pain Estimation from Facial Expressions Task, (ii) Pain Recognition from Multimodal ...
arXiv:2001.07739v3
fatcat:fx5py5fkwfgs7bgrcvud57kksq
EMOPAIN Challenge 2020: Multimodal Pain Evaluation from Facial and Bodily Expressions
2020
2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)
from facial expressions, pain recognition from multimodal movement, and protective movement behaviour detection. ...
The EmoPain 2020 Challenge is the first international competition aimed at creating a uniform platform for the comparison of multi-modal machine learning and multimedia processing methods of chronic pain ...
Utilising the visual and movement data dimensions, the EmoPain 2020 challenge presents three pain recognition tasks: (i) Pain Estimation from Facial Expressions Task, (ii) Pain Recognition from Multimodal ...
doi:10.1109/fg47880.2020.00078
fatcat:3zfntqlfhresxlo56jiymolhf4
The AffectMove 2021 Challenge - Affect Recognition from Naturalistic Movement Data
2021
2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)
Participating teams competed to solve at least one of three tasks based on datasets of different sensor types and real-life problems: the multimodal EmoPain dataset for chronic pain physical rehabilitation ...
The AffectMove challenge aimed to take advantage of existing body movement datasets to address key research problems of automatic recognition of naturalistic and complex affective behaviour from this type ...
In the rest of this paper, we review the state of the art in the areas of bodily-expressed affect recognition in the context of pain, learning, and dance, which are relevant to the challenge. ...
doi:10.1109/aciiw52867.2021.9666322
fatcat:xdgxgs3d4jaifoam6rjzevg2uy
Deep-Learning-Based Models for Pain Recognition: A Systematic Review
2020
Applied Sciences
Finally, it provides a discussion of the challenges and open issues. ...
Recently, deep-learning methods have emerged to solve many challenges, such as feature selection and cases with small data sets. ...
Acknowledgments: This research project was supported by a grant from the Research Center of the Female Scientific and Medical Colleges, Deanship of Scientific Research, King Saud University. ...
doi:10.3390/app10175984
fatcat:fge2g2pzmfcgnboizubgwflt7y
PLAAN: Pain Level Assessment with Anomaly-detection based Network
2021
Journal on Multimodal User Interfaces
The evaluation of the network is performed on pain intensity estimation and protective behaviour estimation tasks from body movements in the EmoPain Challenge dataset. ...
level of pain and presence or absence of protective behaviour in chronic low back pain patients. ...
We also thank the EmoPain challenge organizers for sharing the dataset with us. ...
doi:10.1007/s12193-020-00362-8
fatcat:hwily4z4xjghzkdw2nwv6dyb4e
Bridging the gap between emotion and joint action
[article]
2021
arXiv
pre-print
We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced ...
In this review, we first identify the gap and then stockpile evidence showing strong entanglement between emotion and acting together from various branches of science. ...
The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset. IEEE Trans. Affect. Comput. 7, 435-451. ...
arXiv:2108.06264v1
fatcat:zgindzp6ojfnnoij6orggkeuyy
Bridging the gap between emotion and joint action
2021
Neuroscience and Biobehavioral Reviews
We propose an integrative approach to bridge the gap, highlight five research avenues to do so in behavioral neuroscience and digital sciences, and address some of the key challenges in the area faced ...
In this review, we first identify the gap and then stockpile evidence showing strong entanglement between emotion and acting together from various branches of science. ...
They show that evaluations made on body expressions rather than on facial expressions lead to more accurate assessment of the affective valence of the situation that triggered such expressions. ...
doi:10.1016/j.neubiorev.2021.08.014
pmid:34418437
fatcat:jg5hmiwpbnfenkt33flrg32hde
Affect recognition & generation in-the-wild
2021
Finally, we present an approach for facial affect synthesis in terms of valence-arousal or basic expressions. ...
Affect recognition based on a subject's facial expressions has been a topic of major research in the attempt to generate machines that can understand the way subjects feel, act and react. ...
Examples include the collected EmoPain [7] and UNBC-McMaster [136] databases for analysis of pain, the RU-FACS database of subjects participating in a false opinion scenario [15] and the SEMAINE ...
doi:10.25560/87156
fatcat:cuh7si4f7bao7c3jsousrulbla