648 Hits in 6.6 sec

Automatic Assessment of Depression Based on Visual Cues: A Systematic Review

Anastasia Pampouchidou, Panagiotis Simos, Kostas Marias, Fabrice Meriaudeau, Fan Yang, Matthew Pediaditis, Manolis Tsiknakis
<span title="">2017</span> <i title="Institute of Electrical and Electronics Engineers (IEEE)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/4bxbdgkmy5ea3prtrfsep6ycym" style="color: black;">IEEE Transactions on Affective Computing</a> </i> &nbsp;
automatic depression assessment utilizing visual cues alone or in combination with vocal or verbal cues.  ...  Automatic depression assessment based on visual cues is a rapidly growing research domain.  ...  [164] presented a deep learning approach for detecting three levels of depression. Finally, Joshi et al.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/taffc.2017.2724035">doi:10.1109/taffc.2017.2724035</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/frjnpewzu5fsfn46gkavkyvlyq">fatcat:frjnpewzu5fsfn46gkavkyvlyq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20201108175324/https://ieeexplore.ieee.org/ielx7/5165369/8911290/08052569.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/3d/a1/3da16c95a28d41f4badd0eb82ada959f53906788.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/taffc.2017.2724035"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data

Yuki Matsuda, Dmitrii Fedotov, Yuta Takahashi, Yutaka Arakawa, Keiichi Yasumoto, Wolfgang Minker
<span title="2018-11-15">2018</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/taedaf6aozg7vitz5dpgkojane" style="color: black;">Sensors</a> </i> &nbsp;
We also used existing databases of emotionally rich interactions to train emotion-recognition models and apply them in a cross-corpus fashion to generate emotional-state prediction for the audiovisual  ...  As a typical use case that has a high demand for context awareness but is not tackled widely yet, we focus on the tourism domain.  ...  We used RMSProp [47] as an optimizer with a learning rate of 0.01.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/s18113978">doi:10.3390/s18113978</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/do5tea4eare6hasjk7uie3ymye">fatcat:do5tea4eare6hasjk7uie3ymye</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20190224020058/http://pdfs.semanticscholar.org/5a61/f6bade677e16ceb22bb53bffc9542fed9f48.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/5a/61/5a61f6bade677e16ceb22bb53bffc9542fed9f48.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/s18113978"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

A review of affective computing: From unimodal analysis to multimodal fusion

Soujanya Poria, Erik Cambria, Rajiv Bajpai, Amir Hussain
<span title="">2017</span> <i title="Elsevier BV"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/u3qqmkiofjejrnpdxh3hdgssm4" style="color: black;">Information Fusion</a> </i> &nbsp;
With the proliferation of videos posted online (e.g., on YouTube, Facebook, Twitter) for product reviews, movie reviews, political views, and more, affective computing research has increasingly evolved  ...  As part of this review, we carry out an extensive study of different categories of state-of-the-art fusion techniques, followed by a critical analysis of potential performance improvements with multimodal  ...  API that employs deep learning for emotion recognition.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.inffus.2017.02.003">doi:10.1016/j.inffus.2017.02.003</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/ytebhjxlz5bvxcdghg4wxbvr6a">fatcat:ytebhjxlz5bvxcdghg4wxbvr6a</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20190430071001/https://dspace.stir.ac.uk/bitstream/1893/25490/1/affective-computing-review.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/77/86/778617c5029256eba82b58921e6a70804524fe6d.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.inffus.2017.02.003"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> elsevier.com </button> </a>

Wearable affect and stress recognition: A review [article]

Philip Schmidt, Attila Reiss, Robert Duerichen, Kristof Van Laerhoven
<span title="2018-11-21">2018</span> <i > arXiv </i> &nbsp; <span class="release-stage" >pre-print</span>
Affect recognition aims to detect a person's affective state based on observables, with the goal to e.g. provide reasoning for decision making or support mental wellbeing.  ...  Wearable systems offer an ideal platform for long-term affect recognition applications due to their rich functionality and form factor.  ...  From a healthcare point of view, wearable a ect recognition systems could, for instance, help to ubiquitously monitor the state of patients with mental disorders (e.g. depression).  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1811.08854v1">arXiv:1811.08854v1</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/237d326gljhp5liqfnjet37iny">fatcat:237d326gljhp5liqfnjet37iny</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200906183650/https://arxiv.org/pdf/1811.08854v1.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/32/a3/32a34207b9ef5f519ba09376b8032252a9b60a42.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1811.08854v1" title="arxiv.org access"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> arxiv.org </button> </a>

Artificial Intelligence for Suicide Assessment using Audiovisual Cues: A Review [article]

Sahraoui Dhelim, Liming Chen, Huansheng Ning, Chris Nugent
<span title="2022-01-01">2022</span>
Subsequently, we have witnessed fast-growing literature of researches that applies AI to extract audiovisual non-verbal cues for mental illness assessment.  ...  However, the majority of the recent works focus on depression, despite the evident difference between depression signs and suicidal behavior non-verbal cues.  ...  Surprisingly, deep-learning models are rarely used in the literature of suicide detection from audiovisual data, despite their popularity in similar tasks such as depression detection [46] , as deep-learning  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.48550/arxiv.2201.09130">doi:10.48550/arxiv.2201.09130</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/6e3hv7lb5zastcyoe3zm7yuq4i">fatcat:6e3hv7lb5zastcyoe3zm7yuq4i</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20220128114117/https://arxiv.org/ftp/arxiv/papers/2201/2201.09130.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/3c/30/3c307c2f467541ca9a730d7f031209cd00ab5bfb.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.48550/arxiv.2201.09130"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

A Systematic Review of Robotic Rehabilitation for Cognitive Training

Fengpei Yuan, Elizabeth Klavon, Ziming Liu, Ruth Palan Lopez, Xiaopeng Zhao
<span title="2021-05-11">2021</span> <i title="Frontiers Media SA"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/t4zwwbshrrfd3hjbg4s3bysm7q" style="color: black;">Frontiers in Robotics and AI</a> </i> &nbsp;
In this article, we carried out a systematic review on recent developments in robot-assisted cognitive training.  ...  We also conducted a meta analysis on the articles that evaluated robot-assisted cognitive training protocol with primary end users (i.e., people with cognitive disability).  ...  To date, a great progress has been made thanks to the advancement of machine learning and deep learning.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3389/frobt.2021.605715">doi:10.3389/frobt.2021.605715</a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pubmed/34046433">pmid:34046433</a> <a target="_blank" rel="external noopener" href="https://pubmed.ncbi.nlm.nih.gov/PMC8144708/">pmcid:PMC8144708</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/5baghenfgzcjlb2cwx4af5c46q">fatcat:5baghenfgzcjlb2cwx4af5c46q</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20210512024922/https://fjfsdata01prod.blob.core.windows.net/articles/files/605715/pubmed-zip/.versions/2/.package-entries/frobt-08-605715-r1/frobt-08-605715.pdf?sv=2018-03-28&amp;sr=b&amp;sig=%2FD8HJFVIuJeLDs5j9odNVzGgJlbdnlRjZnRYHjR6hjM%3D&amp;se=2021-05-12T02%3A49%3A51Z&amp;sp=r&amp;rscd=attachment%3B%20filename%2A%3DUTF-8%27%27frobt-08-605715.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/97/61/9761691d4af86b267d3fb5d13409a8a557457a33.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3389/frobt.2021.605715"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> frontiersin.org </button> </a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8144708" title="pubmed link"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> pubmed.gov </button> </a>

Leveraging Multi-Modal Sensing for Mobile Health: A Case Review in Chronic Pain

Min S. Hane Aung, Faisal Alquaddoomi, Cheng-Kang Hsieh, Mashfiqui Rabbi, Longqi Yang, J. P. Pollak, Deborah Estrin, Tanzeem Choudhury
<span title="2016-05-09">2016</span> <i title="Institute of Electrical and Electronics Engineers (IEEE)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/aznf273kcvcbfjcdeghr3xjd6i" style="color: black;">IEEE Journal on Selected Topics in Signal Processing</a> </i> &nbsp;
We present a consolidated discussion on the leveraging of various sensing modalities along with modular server-side and on-device architectures required for this task.  ...  We review examples that deliver actionable information to clinicians and patients while addressing privacy, usability, and computational constraints.  ...  In light of this, we point to the increasingly popular paradigm of deep learning [102] , [103] , which has the capacity to learn feature representations and map low level data to highly abstract categories  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/jstsp.2016.2565381">doi:10.1109/jstsp.2016.2565381</a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pubmed/30906495">pmid:30906495</a> <a target="_blank" rel="external noopener" href="https://pubmed.ncbi.nlm.nih.gov/PMC6430587/">pmcid:PMC6430587</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/vxvj3c23bngnbntwrk3pev5cpq">fatcat:vxvj3c23bngnbntwrk3pev5cpq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200306212347/https://ueaeprints.uea.ac.uk/id/eprint/72396/1/Accepted_Manuscript.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/d2/14/d214165e168125c7ccc2260d2244043eecb261f2.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/jstsp.2016.2565381"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6430587" title="pubmed link"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> pubmed.gov </button> </a>

Effects of memantine treatment on language abilities and functional communication: A review of data [chapter]

<span title="2016-04-14">2016</span> <i title="Routledge"> Pharmacology and Aphasia </i> &nbsp;
This book is essential reading for anyone interested in the rehabilitation of aphasia and related cognitive disorders. This book was originally published as a special issue of Aphasiology.  ...  Leaders in the field provide tutorial reviews on how focal brain injury and degeneration impact on the normal activity of different neurotransmitter systems and how drugs combined or not with rehabilitation  ...  A picture cue with a written label was produced for a number of nouns, verbs and adjectives which could be combined to build sensible clauses.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.4324/9781315755816-18">doi:10.4324/9781315755816-18</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/pikwmmwmozcgtdxwyoqlwtdhly">fatcat:pikwmmwmozcgtdxwyoqlwtdhly</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20201229150407/https://s3-euw1-ap-pe-ws4-capi2-distribution-p.s3-eu-west-1.amazonaws.com/books/9781315755816/HUBPMP/9781315755816_googleScholarPDF.pdf?response-content-disposition=attachment%3B%20filename%3D%229781315755816_googlepreview.pdf%22&amp;response-content-type=application%2Fpdf&amp;x-amz-security-token=IQoJb3JpZ2luX2VjEKf%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCWV1LXdlc3QtMSJHMEUCIQDl7Unqk8GW3vKbfXydYSZ6Z5EX8hf%2ByWBzQZxaB4JxhAIgCQ19wix%2B7hjr7ZcZYbDNzMNyNQR26j3nqOV4O8%2BcdmsqtAMIXxACGgwwMTIxNzcyNjQ1MTEiDOwRxg%2BWx%2B1oI2yaqyqRAyF4iiMMvwv%2BFDwesKV%2BHLy%2BPQqJeraZ1TscrPL4LoJC0ryynqB84ihrF6VZXibFzROj1XFtAlqtmlTEivzUu2ZSQlV8%2FNKX90ETB0rdO9%2BgJr0khhIj1CTzwQ7Q2xl%2B4TyNqS%2BITD7OxBhI1eF0FOYMxQKuNEeIJ%2B4e0sfjyv0KhpUnQyf%2Fqy30IHcri05xvGIqoz10Ynzbcp5BLyz99wNaobsDBd4LQspPGbG38dOThvDZT8L0znwOpV8j74OdQmHYcurdAy9lwwKie8D7UN%2FLDqu2O4mYJ5jRiE7vO4AmMsymcDgQ5RdpwDAH9I5Z%2FloP28C6fZYu41TkUflyTWEQ7edSHUUaNOdRNvbEpscrLbfQRZTh8zW6PC67DNO%2Fnon4k44pi79KqOL7DSVijii4j%2BSjHRSoDF7XQ9HD25Lr2s8%2F34bbTqYNLiZGw%2F6ZYK47AAOK1nwN3%2BCF3qew4fxUoRlrWaMsN5a0Y4AFp40tB63AUFPJcfLi8b66R26cLnlOSlEPMl0aXattCl19ghUtMJP1rP8FOusBb%2Fl%2BIY2mWvh1%2F%2Fbn9%2B1ZpbS8kCxsqeCsil%2BBvkwbmw%2FO9VfX7KXXfSzqSvCqRh0E7FbmT6FRRHukg9CRy9OOp7UKmkaIvR9qr8YW0aZtpv6jNbRklc6h26YuU95WhEiuEZ%2BL%2BTTOetntG0jncwxE7XI%2FDX7XPxq4mcD2Zi8XGZ%2FWBwl4x4HbiZlOBpnsgZkFo4DxouiRKtTSoQ2%2FCxStMQRtoUsPkyk3Kc%2FlVtm0H0G02%2BjAPXYOrySiQaLDaWK53AWBZraFrE9rsvVdWQ%2B9oV00o%2FZI29yJMMHYh7%2Fc1QPQ1TU8LTeJO%2FiVpg%3D%3D&amp;AWSAccessKeyId=ASIAQFVOSJ573LLMZLEY&amp;Expires=1609859046&amp;Signature=TTuF7sixwt2FYAtspkuFuoDRmfw%3D" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/5a/a3/5aa37b2ae19dfed311b41c469bd69c8b231fc939.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.4324/9781315755816-18"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Technologies for Multimodal Interaction in Extended Reality—A Scoping Review

Ismo Rakkolainen, Ahmed Farooq, Jari Kangas, Jaakko Hakulinen, Jussi Rantala, Markku Turunen, Roope Raisamo
<span title="2021-12-10">2021</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/bn3b3ia5mzg4nbdativq7ki3sy" style="color: black;">Multimodal Technologies and Interaction</a> </i> &nbsp;
We conclude with our perspective on promising research avenues for multimodal interaction technologies.  ...  This scoping review summarized recent advances in multimodal interaction technologies for head-mounted display-based (HMD) XR systems.  ...  They also discussed deep-learning-based methods. Vuletic et al. [50] carried out a review of hand gestures used in HCI. Chen et al.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/mti5120081">doi:10.3390/mti5120081</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/3oea4tqlwfcojfvnohv72id7i4">fatcat:3oea4tqlwfcojfvnohv72id7i4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20220206205001/https://mdpi-res.com/d_attachment/mti/mti-05-00081/article_deploy/mti-05-00081-v3.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/e9/15/e9159c0b61b308005c6dbd95df787fd6fc199055.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/mti5120081"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

Findings about LORETA Applied to High-Density EEG—A Review

Serena Dattola, Francesco Carlo Morabito, Nadia Mammone, Fabio La Foresta
<span title="">2020</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/ikdpfme5h5egvnwtvvtjrnntyy" style="color: black;">Electronics</a> </i> &nbsp;
Electroencephalography (EEG) is a non-invasive diagnostic technique for recording brain electric activity.  ...  In this way, the combination of LORETA with HD-EEGs could become an even more valuable tool for noninvasive clinical evaluation in the field of applied neuroscience.  ...  with a more recurrent depressive illness course.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/electronics9040660">doi:10.3390/electronics9040660</a> <a target="_blank" rel="external noopener" href="https://doaj.org/article/2d5016c346ce4dd79d5bfade99c4be1d">doaj:2d5016c346ce4dd79d5bfade99c4be1d</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/unnkjbasxjbdfhsz7en5fxxqa4">fatcat:unnkjbasxjbdfhsz7en5fxxqa4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200508193755/https://res.mdpi.com/d_attachment/electronics/electronics-09-00660/article_deploy/electronics-09-00660.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/97/f7/97f73b659213d97bc0643a8960a2c66b14d3672a.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/electronics9040660"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis [article]

Julio C. S. Jacques Junior, Yağmur Güçlütürk, Marc Pérez, Umut Güçlü, Carlos Andujar, Xavier Baró, Hugo Jair Escalante, Isabelle Guyon, Marcel A. J. van Gerven, Rob van Lier, Sergio Escalera
<span title="2019-07-17">2019</span> <i > arXiv </i> &nbsp; <span class="release-stage" >pre-print</span>
approaches for apparent personality trait recognition.  ...  From the computational point of view, by far speech and text have been the most considered cues of information for analyzing personality.  ...  We thank ChaLearn Looking at People sponsors for their support, including Microsoft Research, Google, NVIDIA Corporation, Amazon, Facebook and Disney Research.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1804.08046v3">arXiv:1804.08046v3</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/bqsfh2vdcjgtveo5qnw5duppeq">fatcat:bqsfh2vdcjgtveo5qnw5duppeq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20191222003524/https://arxiv.org/pdf/1804.08046v3.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/c3/27/c327a967854cb8dd9ddc35b2eaeb1647ac6b340e.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1804.08046v3" title="arxiv.org access"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> arxiv.org </button> </a>

Crossmodal Processing in the Human Brain: Insights from Functional Neuroimaging Studies

G. A. Calvert
<span title="2001-12-01">2001</span> <i title="Oxford University Press (OUP)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/eeg67t2wzfd3dpiicxsttg3cxi" style="color: black;">Cerebral Cortex</a> </i> &nbsp;
of content and spatial information, and crossmodal learning.  ...  The different analytic strategies adopted by different groups may also be a significant factor contributing to the variability in findings.  ...  By contrast, spatially disparate crossmodal cues can produce a profound response depression.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1093/cercor/11.12.1110">doi:10.1093/cercor/11.12.1110</a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pubmed/11709482">pmid:11709482</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/efd4qlyetzbsbbbtfadon4vs4e">fatcat:efd4qlyetzbsbbbtfadon4vs4e</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200120183024/https://watermark.silverchair.com/1101110.pdf?token=AQECAHi208BE49Ooan9kkhW_Ercy7Dm3ZL_9Cf3qfKAc485ysgAAAmUwggJhBgkqhkiG9w0BBwagggJSMIICTgIBADCCAkcGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMIjZ42oaDwzJvMMWnAgEQgIICGDIG-cvMRHKX6MQvOWiajmeDghxD64x9NP0PTasr24mO3gdaeTk6I0uAo-g61a2afekUnyXacSwF0oMWt-LwM1sldrBLSg3jNQdE5Q6UvlDanZCDXIfcEUjyqjpqdR-wkBRNGvrSrx82qAwb_RSkvyJKSp6cCP7mQpYpnU1KQ_CIpqHjobMz5JWH0q81nweYD8pJEHHF5tzX2CTDyn-IRxq95kBY4lgNTwnn-OU3ff_FiEkuVeVIci5dIEAxEeqBETA2rfNWevIXQ2edrAG7GhHcdXpmd03rLSogEoPyUovnklAtu9nNwoLMVk8yYHiyggKkyLM-Qh50inCUmfjRi_6tNTbf5h2z50-Iy1VKGnoza3RXG-IwGdU1IZFsvPddPU-PFOzdPGzbYRSu8Asf-X4YxwnUZaC_QOsKxKmFYdyGLBsOzN1AJj3lV1OHVu8x_erLrL_B0c7OCU_W_xYm-zT2UzIO3EFAh9NkhAgzqgfb6tjKaGc7XdnP52xhyHinexXk2iugJilGSpRskYdxfoFcUJ4wJjPcz8faqjlv6q0OICJbfuTONFhv8a-DHFl9MTEjbxVQH0Zn5eZETKnvRPXjLYNorhdzBjsDkvcaqdui2v5YrA4-sBNZ2-8_vTqXOhoo5nzcNwvKHuppcnP8uinxk7U0_SWalVck8M-w2Az9z6FFM_OuUL59ZAz9VlzY5FVf5ElZ0bjW" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/11/bd/11bdf3b23cfda3a8947a5c019b4045840a4396b6.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1093/cercor/11.12.1110"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> oup.com </button> </a>

Toward a Neural Basis of Music Perception – A Review and Updated Model

Stefan Koelsch
<span title="">2011</span> <i title="Frontiers Media SA"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/5r5ojcju2repjbmmjeu5oyawti" style="color: black;">Frontiers in Psychology</a> </i> &nbsp;
-the model presented here overlaps with models for language processing (for a discussion on the term modularity see also Fodor et al., 1991) .  ...  The following sections will review research findings about the workings of these modules, thus synthesizing current knowledge into a framework for neuroscientific research in the field of music perception  ...  For a review see LeDoux (2000) . 2 For a study using FFRs to investigate effects of musical training on audiovisual integration of music as well as speech see Musacchia et al. (2007) ; for a study on  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3389/fpsyg.2011.00110">doi:10.3389/fpsyg.2011.00110</a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pubmed/21713060">pmid:21713060</a> <a target="_blank" rel="external noopener" href="https://pubmed.ncbi.nlm.nih.gov/PMC3114071/">pmcid:PMC3114071</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/niqydbu7jzhr3ak6uliqxngfgq">fatcat:niqydbu7jzhr3ak6uliqxngfgq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170830001547/https://fjfsdata01prod.blob.core.windows.net/articles/files/10658/pubmed-zip/.versions/1/.package-entries/fpsyg-02-00110/fpsyg-02-00110.pdf?sv=2015-12-11&amp;sr=b&amp;sig=FCUXVAhQUFpqNWZg40sqPj%2BnBzKDMajlMlVWcg7Kyeo%3D&amp;se=2017-08-30T00%3A16%3A07Z&amp;sp=r&amp;rscd=attachment%3B%20filename%2A%3DUTF-8%27%27fpsyg-02-00110.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/cf/f4/cff43e6709a3cbb54f7cb09b32132e95691a8572.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3389/fpsyg.2011.00110"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> frontiersin.org </button> </a> <a target="_blank" rel="external noopener" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3114071" title="pubmed link"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> pubmed.gov </button> </a>

Continuous Analysis of Affect from Voice and Face [chapter]

Hatice Gunes, Mihalis A. Nicolaou, Maja Pantic
<span title="">2011</span> <i title="Springer London"> Computer Analysis of Human Behavior </i> &nbsp;
on open issues and new challenges in the field, and (ii) introduce a representative approach for H.  ...  and emotions are complex constructs, with fuzzy boundaries and with substantial individual differences in expression and experience [7] .  ...  Kanluan et al. combine audio and visual cues for affect recognition in V-A space by fusing facial expression and audio cues, using Support Vector Machines for Regression (SVR) and late fusion with a weighted  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1007/978-0-85729-994-9_10">doi:10.1007/978-0-85729-994-9_10</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/2awlffumuvcp3fs6e75ltokdeq">fatcat:2awlffumuvcp3fs6e75ltokdeq</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20130108011250/http://www.eecs.qmul.ac.uk/~hatice/GunesEtAl_CAHB2011.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/eb/88/eb8887dd44392d804f37e7268bf32f142123e0b0.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1007/978-0-85729-994-9_10"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> springer.com </button> </a>

A Bellwether for All Library Services in the Future: A Review of User-Centered Library Integrations with Learning Management Systems

Liz Thompson, David S. Vess
<span title="2017-08-17">2017</span> <i title="Virginia Tech Libraries"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/blxws25cpvakvjtjfmrppjpeiq" style="color: black;">Virginia Libraries</a> </i> &nbsp;
expectations, and perceptions of library integrations are the foundation for a successful integration.  ...  of library integrations with the LMS.  ...  is moving informal learning recognition into practice.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.21061/valib.v62i1.1472">doi:10.21061/valib.v62i1.1472</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/3m4u2ggkpvh5pcjvc3t6otpctm">fatcat:3m4u2ggkpvh5pcjvc3t6otpctm</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20190428082727/http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/41/de/41deb1fd41202051951cc33444c6adfe4495fe9d.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.21061/valib.v62i1.1472"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> Publisher / doi.org </button> </a>
Showing results 1–15 of 648