
Exploration of Affect Sensing from Speech and Metaphorical Text [chapter]

Li Zhang
2009 Lecture Notes in Computer Science  
The detected affective states from text also play an important role in producing emotional animation for users' avatars. Evaluation of the affect detection from speech and text is provided.  ...  We report new developments on affect detection from textual metaphorical affective expression and affect sensing from speech.  ...  An example of real-time interaction: In this section, we report the framework of our application and how to employ the detected affect from text to activate emotional animation for users' avatars.  ... 
doi:10.1007/978-3-642-03364-3_31 fatcat:iojivos54vcx5bu4zoehpur2fi

Recognition of Emotion from Speech: A Review [chapter]

S. Ramakrishnan
2012 Speech Enhancement, Modeling and Recognition- Algorithms and Applications  
significant to real-time detection.  ...  Computer Games: Computer games can be controlled through the emotions in human speech. The computer recognizes a human emotion from the player's speech and computes the game's difficulty level (easy, medium, hard).  ... 
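The game-control idea in this entry can be sketched as a simple mapping from a recognized emotion label to a difficulty level. The labels and the mapping below are illustrative assumptions, not taken from the cited chapter.

```python
# Toy mapping from an emotion recognized in the player's speech to the
# next game difficulty level. Labels and policy are assumptions.
DIFFICULTY_BY_EMOTION = {
    "bored":      "hard",    # raise the challenge
    "frustrated": "easy",    # back off
    "neutral":    "medium",
    "excited":    "medium",  # keep the current pace
}

def adjust_difficulty(detected_emotion: str) -> str:
    """Return the next difficulty level for a recognized emotion."""
    return DIFFICULTY_BY_EMOTION.get(detected_emotion, "medium")

print(adjust_difficulty("bored"))  # → hard
```

A real system would feed this function the output of a speech-emotion classifier; unknown labels fall back to "medium".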
doi:10.5772/39246 fatcat:zpfbbxcaj5b5lnayo2peznx7pm

Survey on AI-Based Multimodal Methods for Emotion Detection [chapter]

Catherine Marechal, Dariusz Mikołajewski, Krzysztof Tyburek, Piotr Prokopowicz, Lamine Bougueroua, Corinne Ancourt, Katarzyna Węgrzyn-Wolska
2019 Msphere  
We focus on the impact of emotion analysis and the state of the art of multimodal emotion detection.  ...  Quick and accurate emotion recognition may increase the ability of computers, robots, and integrated environments to recognize human emotions and respond to them according to social rules.  ...  Much research has been conducted to enable machines to detect and understand human affective states, such as emotions, interests, and behavior.  ... 
doi:10.1007/978-3-030-16272-6_11 fatcat:fzyhqntc7rb3fdsic4fzsippzq

A Multimodal Emotion Sensing Platform for Building Emotion-Aware Applications [article]

Daniel McDuff, Kael Rowan, Piali Choudhury, Jessica Wolk, ThuVan Pham, Mary Czerwinski
2019 arXiv   pre-print
We hope that this platform helps advance the state-of-the-art in affective computing by enabling development of novel human-computer interfaces.  ...  Humans use a host of signals to infer the emotional state of others. In general, computer systems that leverage signals from multiple modalities will be more robust and accurate in the same task.  ...  Acknowledgments The authors would like to thank Michael Gamon, Mark Encarnacion, Ivan Tashev, Cha Zhang, Emad Barsoum, Dan Bohus and Nick Saw for the contribution of models and PSI components that are  ... 
arXiv:1903.12133v1 fatcat:xn33rcxypzg33gv4fwzpltpk2i

Text and Voice Input to Emotional Analysis for AI Clients

Simran Bhake
2020 International Journal for Research in Applied Science and Engineering Technology  
Detecting the emotional state of a person can be challenging but is also very crucial. Emotions can be detected from humans in various ways, i.e., facial expressions, gestures, speech, and written text.  ...  In order to make communication and interaction with machines as real as possible, the emotional status of humans must be recognized by the machine.  ...  utterances are fed into an SVM along with their class labels. c) Developing an SVM model for each emotional state using feature values. d) Taking real-time input and comparing it with predicted values.  ... 
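The per-emotion training-and-prediction loop in steps a)–d) above can be sketched as follows. A real system would train an SVM (e.g. scikit-learn's `SVC`) on acoustic feature vectors; here a dependency-free nearest-centroid stand-in plays that role, and the feature values are toy numbers.

```python
# Minimal sketch of steps a)-d): build one model per emotional state from
# labeled utterance features, then classify a real-time input against them.
# Nearest-centroid stands in for the SVM; features are toy 2-D vectors.
from math import dist

# a)/b) Utterance feature vectors fed in with their emotion class labels.
train = {
    "happy": [[0.9, 0.1], [0.8, 0.2]],
    "sad":   [[0.1, 0.9], [0.2, 0.8]],
}

# c) Build one model (here: a centroid) per emotional state.
centroids = {
    label: [sum(col) / len(vecs) for col in zip(*vecs)]
    for label, vecs in train.items()
}

# d) Compare a real-time input against the per-emotion models.
def classify(features):
    return min(centroids, key=lambda label: dist(features, centroids[label]))

print(classify([0.85, 0.15]))  # → happy
```

Swapping the centroid comparison for `sklearn.svm.SVC.fit`/`predict` would recover the SVM pipeline the abstract describes.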
doi:10.22214/ijraset.2020.1042 fatcat:xu6b7hsf6befrgxhlo2iaiinkq

CASIE – Computing affect and social intelligence for healthcare in an ethical and trustworthy manner

Laurentiu Vasiliu, Keith Cortis, Ross McDermott, Aphra Kerr, Arne Peters, Marc Hesse, Jens Hagemeyer, Tony Belpaeme, John McDonald, Rudi Villing, Alessandra Mileo, Annalina Capulto (+4 others)
2021 Paladyn: Journal of Behavioral Robotics  
We propose an architecture that addresses a wide range of social cooperation skills and features required for real human–robot social interaction, which includes language and vision analysis, dynamic emotional  ...  capabilities, for contextually appropriate robot behaviours and cooperative social human–robot interaction for the healthcare domain.  ...  Conflict of interest: Authors state no conflict of interest.  ... 
doi:10.1515/pjbr-2021-0026 fatcat:h63j74p7rzh45eh4bw7quxovwu

SceneMaker: Creative Technology for Digital StoryTelling [chapter]

Murat Akser, Brian Bridges, Giuliano Campo, Abbas Cheddad, Kevin Curran, Lisa Fitzpatrick, Linley Hamilton, John Harding, Ted Leath, Tom Lunney, Frank Lyons, Minhua Ma (+12 others)
2017 Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering  
SceneMaker will highlight affective or emotional content in digital storytelling with particular focus on character body posture, facial expressions, speech, non-speech audio, scene composition, timing  ...  We propose here the development of a flagship computer software platform, SceneMaker, acting as a digital laboratory workbench for integrating and experimenting with the computer processing of new theories  ...  Detecting Emotions and Personality in Film/Play Scripts All modalities of human interaction express emotional states and personality, such as voice, word choice, gesture, body posture and facial expression  ... 
doi:10.1007/978-3-319-55834-9_4 fatcat:l6iie5vycng7nga4x2f7eeoxh4

Can a Robot Trust You? A DRL-Based Approach to Trust-Driven Human-Guided Navigation [article]

Vishnu Sashank Dorbala, Arjun Srinivasan, Aniket Bera
2020 arXiv   pre-print
We look at quantifying various affective features from language-based instructions and incorporate them into our policy's observation space in the form of a human trust metric.  ...  We showcase the efficacy of our results in both simulation and a real-world environment.  ...  To the best of our knowledge, ours is the first approach to quantify a robot trust metric. 2) We present an approach to extract affective features from human language commands using both text and speech  ... 
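The idea of folding affective features into a policy's observation space can be sketched as below. The feature names (valence, arousal), the weighting, and the `Observation` layout are illustrative assumptions, not the paper's actual formulation.

```python
# Sketch: derive a scalar trust metric from affective features extracted
# from a spoken instruction, and append it to the policy's observation.
# The linear weighting is an assumption for illustration only.
from dataclasses import dataclass

@dataclass
class Observation:
    lidar: list          # the robot's usual sensory input
    trust: float         # human trust metric in [0, 1]

def trust_metric(valence: float, arousal: float) -> float:
    """Toy mapping from affective features (in [-1, 1]) to a trust score."""
    score = 0.5 + 0.4 * valence - 0.1 * arousal
    return min(1.0, max(0.0, score))  # clamp to [0, 1]

obs = Observation(lidar=[1.2, 0.8, 2.5], trust=trust_metric(0.6, -0.2))
print(round(obs.trust, 2))  # → 0.76
```

A DRL policy would then consume the concatenated observation, letting the learned navigation behavior condition on how much the human's instructions appear trustworthy.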
arXiv:2011.00554v1 fatcat:hzv5rder3rgtnmi5aqpdnwroci

Designing emotionally sentient agents

Daniel McDuff, Mary Czerwinski
2018 Communications of the ACM  
The field of affective computing studies and develops real systems that sense, interpret, adapt to and potentially convey human emotions. How should one design such a system for societal acceptance?  ...  In this paper, we discuss the design of emotionally sentient systems, from sensing emotional cues to the affective responding of computer agents.  ...  One can design a system that analyzes speech or text for verbal sentiment with a speech-to-text engine.  ... 
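The speech-then-text pipeline mentioned in this entry can be sketched in two stages: a speech-to-text step followed by text sentiment analysis. `transcribe` below is a hypothetical stand-in for a real STT engine, and the sentiment lexicon is a toy example.

```python
# Sketch of verbal sentiment analysis over speech: transcribe first, then
# score the text against a small polarity lexicon. Both the transcriber and
# the lexicon are placeholder assumptions.
POSITIVE = {"great", "love", "happy", "good"}
NEGATIVE = {"bad", "hate", "sad", "terrible"}

def transcribe(audio: bytes) -> str:
    # Placeholder: a real system would call a speech-to-text engine here.
    return "I love this good game"

def verbal_sentiment(text: str) -> str:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(verbal_sentiment(transcribe(b"")))  # → positive
```

Production systems would replace the lexicon with a trained sentiment model, but the two-stage structure (STT, then text analysis) is the point the abstract makes.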
doi:10.1145/3186591 fatcat:gmnohfeuejhnreppnwuxuie22q

Multimodal Interaction System for Home Appliances Control

Hanif Fakhrurroja, Carmadi Machbub, Ary Setijadi Prihatmanto, Ayu Purwarianti
2020 International Journal of Interactive Mobile Technologies  
The average accuracy of interaction using the dialogue system and gestures is 92.5% and 79.25%, respectively; interaction using the dialogue system is better than gesture.  ...  There are two output responses from this system: an audio response generator that provides feedback to the user through the computer speaker, and an action to control home  ...  Acknowledgement The first author acknowledges support from the Lembaga Pengelola Dana Pendidikan (Indonesia Endowment Fund for Education) scholarship, Ministry of Finance, The Republic of Indonesia.  ... 
doi:10.3991/ijim.v14i15.13563 fatcat:cwxr7yv7vbbklpqtrtwxjs3qaq


Neha Jain, Somya Rastogi
2019 Acta Informatica Malaysia  
The objective of this paper is to present the concepts of Speech Recognition Systems, from their evolution to the advancements that have now been adapted to make  ...  Such systems find wide application in areas like signal processing and many more.  ...  Rajendra Kumar (Head of Department CSE/IT) of Vidya College of Engineering, Meerut, India for their constant guidance and support during the research period.  ... 
doi:10.26480/aim.01.2019.01.03 fatcat:adjwcuwztbcovbh4mnlrf2cjpe

Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications

Rafael A Calvo, Sidney D'Mello
2010 IEEE Transactions on Affective Computing  
This survey describes recent progress in the field of Affective Computing (AC), with a focus on affect detection.  ...  It provides meta-analyses on existing reviews of affect detection systems that focus on traditional affect detection modalities like physiology, face, and voice, and also reviews emerging research on more  ...  ACKNOWLEDGMENTS The authors gratefully acknowledge the editor, Jonathan Gratch, the associate editor, Brian Parkinson, and three  ... 
doi:10.1109/t-affc.2010.1 fatcat:ysfyjgewqreiffgu6kbilbyazi

Emotional machines: The next revolution

Valentina Franzoni, Alfredo Milani, Daniele Nardi, Jordi Vallverdú, Valentina Franzoni, Alfredo Milani, Daniele Nardi, Jordi Vallverdú
2019 Web Intelligence: an international journal  
Early approaches to speech management often merely consist of generating a text transcript of the speech and applying known text-based techniques.  ...  Affective computing and social robotics [7, 44] have increased the need for studies not only on how to improve the emotional interaction between humans and machines [57, 59] but also on how to design  ... 
doi:10.3233/web-190395 fatcat:qxucuussiveh7mskklrmwtnlgu

An integrated approach to emotional speech and gesture synthesis in humanoid robots

Philipp Robbel, Mohammed E. Hoque, Cynthia Breazeal
2009 Proceedings of the International Workshop on Affective-Aware Virtual Agents and Social Robots - AFFINE '09  
We describe a method for detecting basic affect signals in the user's speech input and generating appropriately chosen responses on our robot platform.  ...  This paper describes an integrated approach to recognizing and generating affect on a humanoid robot as it interacts with a human user.  ...  Retrieved from [4]. See [9] for a similar circular model of affect. The goal of this robotic agent is to drive the user from a bored to an excited emotional state.  ... 
doi:10.1145/1655260.1655266 fatcat:y5qpui73dbg25lvi2ofrvexwje

Symmetric Multimodality Revisited: Unveiling Users' Physiological Activity

Helmut Prendinger, Mitsuru Ishizuka
2007 IEEE transactions on industrial electronics (1982. Print)  
His research interests include artificial intelligence, affective computing, and human-computer interaction, in which areas he has published more than 75 papers in international journals and conference  ...  Mitsuru Ishizuka (M'78) received the B.S., M.S., and Ph.D. degrees in electronic engineering from the  ...  However, in order to implement human-computer interfaces that are sensitive to, e.g., the user's affective state, multiple modalities have to be processed in real time.  ... 
doi:10.1109/tie.2007.891646 fatcat:wae7gulvu5gb7aotaolhxjok4m