2,277 Hits in 8.6 sec

EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data

Yuki Matsuda, Dmitrii Fedotov, Yuta Takahashi, Yutaka Arakawa, Keiichi Yasumoto, Wolfgang Minker
2018-11-15 · Sensors (MDPI)
In addition, we found that effective features used for emotion and satisfaction estimation are different among tourists with different cultural backgrounds. … As tourist actions, behavioral cues (eye and head/body movement) and audiovisual data (facial/vocal expressions) were collected during sightseeing using an eye-gaze tracker, physical-activity sensors, … The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of the data; in the writing of the manuscript; or in the decision to publish the results. …
doi:10.3390/s18113978 · fatcat:do5tea4eare6hasjk7uie3ymye
Fulltext (Web Archive): https://web.archive.org/web/20190224020058/http://pdfs.semanticscholar.org/5a61/f6bade677e16ceb22bb53bffc9542fed9f48.pdf

Affective Processes: stochastic modelling of temporal context for emotion and facial expression recognition [article]

Enrique Sanchez and Mani Kumar Tellamekala and Michel Valstar and Georgios Tzimiropoulos
2021-03-24 · arXiv (pre-print)
We validate our approach on four databases, two for Valence and Arousal estimation (SEWA and AffWild2), and two for Action Unit intensity estimation (DISFA and BP4D). … with a global latent variable model; (b) temporal context modelling using task-specific predictions in addition to features; and (c) smart temporal context selection. … The views represented are the views of the authors alone and do not necessarily represent the views of the Department of Health in England, NHS, or the National Institute for Health Research. …
arXiv:2103.13372v1 · fatcat:auaeihk46vekno2rshypknxqpa
Fulltext (Web Archive): https://web.archive.org/web/20210403232500/https://arxiv.org/pdf/2103.13372v1.pdf

Survey on RGB, 3D, Thermal, and Multimodal Approaches for Facial Expression Recognition: History, Trends, and Affect-Related Applications

Ciprian Adrian Corneanu, Marc Oliu Simon, Jeffrey F. Cohn, Sergio Escalera Guerrero
2016-08-01 · IEEE Transactions on Pattern Analysis and Machine Intelligence
Building a system capable of automatically recognizing facial expressions from images and video has been an intense field of study in recent years. … We define a new taxonomy for the field, encompassing all steps from face detection to facial expression recognition, and describe and classify the state-of-the-art methods accordingly. … On the other hand, [49] combines shape with global and local appearance features for continuous AU intensity estimation and continuous pain intensity estimation. …
doi:10.1109/tpami.2016.2515606 · pmid:26761193 · pmcid:PMC7426891 · fatcat:ezwkw2bmhbdtlffz3uz3m3hoiy
Fulltext (Web Archive): https://web.archive.org/web/20170517131900/http://www.pitt.edu:80/~jeffcohn/biblio/Survey.pdf

Survey on RGB, 3D, Thermal, and Multimodal Approaches for Facial Expression Recognition: History, Trends, and Affect-related Applications [article]

Ciprian Corneanu, Marc Oliu, Jeffrey F. Cohn, Sergio Escalera
2016-06-10 · arXiv (pre-print)
Building a system capable of automatically recognizing facial expressions from images and video has been an intense field of study in recent years. … We define a new taxonomy for the field, encompassing all steps from face detection to facial expression recognition, and describe and classify the state-of-the-art methods accordingly. … On the other hand, [49] combines shape with global and local appearance features for continuous AU intensity estimation and continuous pain intensity estimation. …
arXiv:1606.03237v1 · fatcat:t55kncgy6fgsvgi42pdgleu43m
Fulltext (Web Archive): https://web.archive.org/web/20200827004658/https://arxiv.org/pdf/1606.03237v1.pdf

Facial Expression Analysis under Partial Occlusion

Ligang Zhang, Brijesh Verma, Dian Tjondronegoro, Vinod Chandran
2018-04-18 · ACM Computing Surveys
The context is right for a comprehensive perspective on these developments and the state of the art. … Automatic machine-based Facial Expression Analysis (FEA) has made substantial progress in the past few decades, driven by its importance for applications in psychology, security, health, entertainment and … Another work was done by Ekman and his colleagues [Ekman 1978], who designed the Facial Action Coding System (FACS) to encode the states of facial expressions using facial Action Units (AUs). …
doi:10.1145/3158369 · fatcat:xmfmw7z275hb7e6dbazjqm5fui
Fulltext (Web Archive, not primary version): https://web.archive.org/web/20200923070501/https://arxiv.org/ftp/arxiv/papers/1802/1802.08784.pdf
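FACS, mentioned in the snippet above, encodes an expression as a combination of action units; a minimal illustrative sketch of that idea (the AU-to-expression table below is a commonly cited EMFACS-style convention, not taken from this survey):

```python
# Illustrative sketch: prototypical expressions as FACS action-unit (AU)
# combinations, and matching an observed AU set against them.
# The mapping below is a commonly cited EMFACS-style convention,
# not a definitive standard.

PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
}

def match_expression(active_aus: set) -> "str | None":
    """Return the first prototype whose AUs are all active, else None."""
    for label, aus in PROTOTYPES.items():
        if aus <= active_aus:  # subset test: all prototype AUs present
            return label
    return None

print(match_expression({6, 12, 25}))  # -> happiness
```

Real systems score AU *intensities* rather than binary presence, but the subset-matching view captures why AU detection is treated as an intermediate step toward expression labels.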

Toward an affect-sensitive multimodal human-computer interaction

M. Pantic, L.J.M. Rothkrantz
2003 · Proceedings of the IEEE
Affective arousal modulates all nonverbal communicative cues (facial expressions, body movements, and vocal and physiological reactions). … In a face-to-face interaction, humans detect and interpret those interactive signals of their communicator with little or no effort. … Koppelaar, and anonymous reviewers for their helpful comments and suggestions. …
doi:10.1109/jproc.2003.817122 · fatcat:a2mq7h2lwzepnjexicqr3spc2m
Fulltext (Web Archive): https://web.archive.org/web/20170809042954/http://www.kbs.twi.tudelft.nl/docs/journal/Pantic.M-ProcIEEE2003.pdf

Learning Pain from Action Unit Combinations: A Weakly Supervised Approach via Multiple Instance Learning [article]

Zhanli Chen, Rashid Ansari, Diana J. Wilkie
2018-02-20 · arXiv (pre-print)
Patient pain can be detected highly reliably from facial expressions using a set of facial muscle-based action units (AUs) defined by the Facial Action Coding System (FACS). … on combinations, or they seek to bypass AU detection by training a binary pain classifier directly on pain intensity data, but are limited by a lack of enough labeled data for satisfactory training. … Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the National Institute of Nursing Research. …
arXiv:1712.01496v2 · fatcat:dp6r3kng5zhxdce5tfxihumw5a
Fulltext (Web Archive): https://web.archive.org/web/20200915151540/https://arxiv.org/pdf/1712.01496v2.pdf

Machine-based Multimodal Pain Assessment Tool for Infants: A Review [article]

Ghada Zamzmi, Dmitry Goldgof, Rangachar Kasturi, Yu Sun, Terri Ashmeade
2019-01-16 · arXiv (pre-print)
This paper comprehensively reviews the automated approaches (i.e., approaches to feature extraction) for analyzing infants' pain and the current efforts in automatic pain recognition. … The intermittent and inconsistent assessment can induce poor treatment and, therefore, cause serious long-term consequences. … Many thanks to the anonymous reviewers whose insightful feedback and constructive suggestions helped in shaping this article into its present form. …
arXiv:1607.00331v3 · fatcat:fyyjk7wpvfeztmdy3macyga2ea
Fulltext (Web Archive): https://web.archive.org/web/20200823180347/https://arxiv.org/pdf/1607.00331v3.pdf

How Can AI Recognize Pain and Express Empathy [article]

Siqi Cao, Di Fu, Xu Yang, Pablo Barros, Stefan Wermter, Xun Liu, Haiyan Wu
2021-10-08 · arXiv (pre-print)
How can we create an AI agent with proactive and reactive empathy? … The current drive for automated pain recognition is motivated by a growing number of healthcare requirements, and demands for social interaction make it increasingly essential. … For now, whether to choose a direct approach (direct feature selection based on raw data) or an indirect approach (body models of specific action units mapping to pain, like PSPI) is unstudied. …
arXiv:2110.04249v1 · fatcat:f6gvjacowfafnfybb2mgoip2ni
Fulltext (Web Archive): https://web.archive.org/web/20211011102609/https://arxiv.org/ftp/arxiv/papers/2110/2110.04249.pdf
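The PSPI mentioned in the snippet above is the Prkachin–Solomon Pain Intensity, conventionally computed from FACS AU intensities; a minimal sketch of that formula (AU intensities on the usual 0–5 scale, AU43 eye closure coded 0/1 — conventions assumed here, not stated by this paper):

```python
def pspi(au: dict) -> float:
    """Prkachin-Solomon Pain Intensity from FACS AU intensities (0-5),
    with AU43 (eye closure) coded 0/1:
        PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43
    Missing AUs are treated as 0."""
    return (au.get("AU4", 0.0)
            + max(au.get("AU6", 0.0), au.get("AU7", 0.0))
            + max(au.get("AU9", 0.0), au.get("AU10", 0.0))
            + au.get("AU43", 0.0))

frame = {"AU4": 2.0, "AU6": 1.0, "AU7": 3.0, "AU9": 0.0, "AU10": 1.0, "AU43": 1.0}
print(pspi(frame))  # 2 + 3 + 1 + 1 = 7.0
```

This is what the snippet calls an "indirect" approach: AUs are detected first and pain is scored from their combination, rather than training a classifier directly on raw frames.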

3D facial expression recognition: A perspective on promises and challenges

T. Fang, X. Zhao, O. Ocegueda, S. K. Shah, I. A. Kakadiaris
2011 · Face and Gesture 2011 (IEEE)
This survey focuses on discrete expression classification and facial action unit recognition performed using 3D face data, possibly including a corresponding 2D texture image. … We also call for standardized experimental protocols in order to draw fair and meaningful comparisons between different systems. … This research was funded in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), through the Army Research Laboratory (ARL) and …
doi:10.1109/fg.2011.5771466 · dblp:conf/fgr/FangZOSK11 · fatcat:x3jpeexcxbfvzo4sbfexi6if2q
Fulltext (Web Archive): https://web.archive.org/web/20170809050255/http://www.cimat.mx/~omar/papers/3DFEA-FG.pdf

Multimodal Affect Recognition: Current Approaches and Challenges [chapter]

Hussein Al Osman, Tiago H. Falk
2017-02-08 · Emotion and Attention Recognition Based on Biological Signals and Images (InTech)
However, the multimodal approach presents challenges pertaining to the fusion of individual signals, dimensionality of the feature space, and incompatibility of collected signals in terms of time resolution … Finally, we summarize the current challenges in the field and provide ideas for future research directions. … predefined action units (AUs). …
doi:10.5772/65683 · fatcat:du7u2lfx4nhkzf5d7zq7g5ofty
Fulltext (Web Archive): https://web.archive.org/web/20190502043607/https://cdn.intechopen.com/pdfs/52941.pdf

Introducing Representations of Facial Affect in Automated Multimodal Deception Detection [article]

Leena Mathur, Maja J Matarić
2020-08-31 · arXiv (pre-print)
We experimented with unimodal Support Vector Machines (SVM) and SVM-based multimodal fusion methods to identify effective features, modalities, and modeling approaches for detecting deception. … This paper presents a novel analysis of the discriminative power of dimensional representations of facial affect for automated deception detection, along with interpretable features from visual, vocal, … The OpenFace 2.2.0 toolkit [6] was used to extract visual features capturing facial action units (FAUs), eye gaze, and head pose from each facial frame of each video. …
arXiv:2008.13369v1 · fatcat:di7bka4lijaczlw6roba6yam6e
Fulltext (Web Archive): https://web.archive.org/web/20200902230855/https://arxiv.org/ftp/arxiv/papers/2008/2008.13369.pdf
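The snippet above mentions extracting FAU, gaze, and head-pose features with the OpenFace toolkit, which writes per-frame CSV files; a minimal sketch of pulling the AU intensity columns out of such a file (the `AUxx_r` column naming follows OpenFace's documented output convention, but the sample rows here are invented):

```python
import csv
import io

# Sketch: extracting AU intensity features from an OpenFace-style
# per-frame CSV. OpenFace names AU intensity columns "AUxx_r" (and
# sometimes pads headers with spaces, hence the .strip() below).
# SAMPLE is invented data in that shape, not real toolkit output.
SAMPLE = """frame,AU01_r,AU06_r,AU12_r,pose_Rx
1,0.4,1.2,2.1,0.03
2,0.0,0.9,1.7,0.05
"""

def au_features(csv_text: str) -> list:
    """Return one {AU column -> intensity} dict per frame,
    dropping non-AU columns such as frame index and head pose."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {k.strip(): float(v) for k, v in row.items()
         if k.strip().startswith("AU")}
        for row in rows
    ]

frames = au_features(SAMPLE)
print(frames[0])  # {'AU01_r': 0.4, 'AU06_r': 1.2, 'AU12_r': 2.1}
```

Per-frame dicts like these are the usual starting point for the SVM-style pipelines the snippet describes: each frame's AU vector (optionally concatenated with gaze and pose columns) becomes one feature vector.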

2020 Index IEEE Transactions on Image Processing Vol. 29

2020 · IEEE Transactions on Image Processing
., +, TIP 2020 2328-2343. Mutual Context Network for Jointly Estimating Egocentric Gaze and Action. … ., +, TIP 2020 225-236. Collective Affinity Learning for Partial Cross-Modal Hashing. Guo, J., +, TIP 2020 1344-1355. Context-Aware Graph Label Propagation Network for Saliency Detection. …
doi:10.1109/tip.2020.3046056 · fatcat:24m6k2elprf2nfmucbjzhvzk3m
Fulltext (Web Archive): https://web.archive.org/web/20201224144031/https://ieeexplore.ieee.org/ielx7/83/8835130/09301460.pdf?tp=&arnumber=9301460&isnumber=8835130&ref=

A Review of Human Activity Recognition Methods

Michalis Vrigkas, Christophoros Nikou, Ioannis A. Kakadiaris
2015-11-16 · Frontiers in Robotics and AI
Recognizing human activities from video sequences or still images is a challenging task due to problems such as background clutter, partial occlusion, changes in scale, viewpoint, lighting, and appearance … Many applications, including video surveillance systems, human-computer interaction, and robotics for human behavior characterization, require a multiple activity recognition system. … This research was funded in part by the UH Hugh Roy and Lillie Cranz Cullen Endowment Fund. …
doi:10.3389/frobt.2015.00028 · fatcat:ywzq5ej2gbhatg62sp46t3usgi
Fulltext (Web Archive): https://web.archive.org/web/20170829143108/http://cbl.uh.edu/pub_files/J4_Vrigkas_FRONTIERS_2015.pdf

Facial Expression Recognition: A Review of Trends and Techniques

Olufisayo Ekundayo, Serestina Viriri
2021 · IEEE Access
None of the works considers intensity estimation of an emotion; neither do they include studies that address data annotation inconsistencies and correlation among labels. … Facial Expression Recognition (FER) is presently the aspect of cognitive and affective computing with the most attention and popularity, aided by its vast application areas. … [173] showed that CNN-learned features correspond to Facial Action Units (FAUs). …
doi:10.1109/access.2021.3113464 · fatcat:hapy6t6ohneupiwh7meakzk3ma
Fulltext (Web Archive): https://web.archive.org/web/20211008180240/https://ieeexplore.ieee.org/ielx7/6287639/6514899/09540650.pdf?tp=&arnumber=9540650&isnumber=6514899&ref=
Showing results 1–15 of 2,277