
Improving sign language processing via few-shot machine learning

G.F. Shovkoplias, D.A. Strokov, D.V. Kasantsev, A.S. Vatian, A.A. Asadulaev, I.V. Tomilov, A.A. Shalyto, N.F. Gusarova
2022 Naučno-tehničeskij Vestnik Informacionnyh Tehnologij, Mehaniki i Optiki  
The method is based on the registration of electromyographic (EMG) muscle signals using bracelets worn on the arm.  ...  speech disabilities.  ...  The first approach [21, 22] uses gradient descent-based methods as the optimization process in the meta-learning, sharing knowledge across all tasks.  ... 
doi:10.17586/2226-1494-2022-22-3-559-566 fatcat:dh7w3dz6ejbixnnmazrzinhmne

How Can AI Recognize Pain and Express Empathy [article]

Siqi Cao, Di Fu, Xu Yang, Pablo Barros, Stefan Wermter, Xun Liu, Haiyan Wu
2021 arXiv   pre-print
Finally, we identify possible future implementations of artificial empathy and analyze how humans might benefit from an AI agent equipped with empathy.  ...  How can we create an AI agent with proactive and reactive empathy?  ...  Fig. 1. An overall illustration of all topics in this review. Fig. 2. A meta-learning process of speech-based pain assessment.  ... 
arXiv:2110.04249v1 fatcat:f6gvjacowfafnfybb2mgoip2ni

Sensors and Artificial Intelligence Methods and Algorithms for Human–Computer Intelligent Interaction: A Systematic Mapping Study

Boštjan Šumak, Saša Brdnik, Maja Pušnik
2021 Sensors  
Researchers most often apply deep-learning and instance-based AI methods and algorithms.  ...  The support vector machine (SVM) is the most widely used algorithm for various kinds of recognition, primarily emotion, facial expression, and gesture recognition.  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s22010020 pmid:35009562 pmcid:PMC8747169 fatcat:2kwsqyrsmvftpgfgme2xpptb7q

Comparison and Efficacy of Synergistic Intelligent Tutoring Systems with Human Physiological Response

Fehaid Alqahtani, Naeem Ramzan
2019 Sensors  
They have found promising application in the field of computer-based learning and tutoring.  ...  The synergism of ITS and physiological signals in automated tutoring systems adapted to the learner's emotions and mental states are presented and compared.  ...  In literature, CBT is also recognized as Computer-Based Learning (CBL) or Computer-Based Instruction (CBI). CBT is a large set and ITS are a subgroup of CBT.  ... 
doi:10.3390/s19030460 fatcat:ni5uc5mxcvec3axj4dqb574yle

How to manage affective state in child-robot tutoring interactions?

Thorsten Schodde, Laura Hoffmann, Stefan Kopp
2017 2017 International Conference on Companion Technology (ICCT)  
The role of affective states during learning has so far only scarcely been considered in such systems, because it is unclear which cues should be tracked, how they should be interpreted, and how the system  ...  Therefore, we conducted expert interviews with preschool teachers, and based on these results suggest a conceptual model for tracing and managing the affective state of preschool children during robot-child  ...  This work was supported by the Cluster of Excellence Cognitive Interaction Technology 'CITEC' (EXC 277) at Bielefeld University, funded by the German Research Foundation (DFG), and by the L2TOR (  ... 
doi:10.1109/companion.2017.8287073 dblp:conf/companion/SchoddeHK17 fatcat:ezbb7uyzy5cbxbn4znjiiv7nda

Towards the automatic detection of social biomarkers in autism spectrum disorder: introducing the simulated interaction task (SIT)

Hanna Drimalla, Tobias Scheffer, Niels Landwehr, Irina Baskow, Stefan Roepke, Behnoush Behnia, Isabel Dziobek
2020 npj Digital Medicine  
Using machine-learning tools, we detected individuals with ASD with an accuracy of 73%, sensitivity of 67%, and specificity of 79%, based on their facial expressions and vocal characteristics alone.  ...  We present a digital tool to automatically quantify biomarkers of social interaction deficits: the simulated interaction task (SIT), which entails a standardized 7-min simulated dialog via video and the  ...  We acknowledge support by the German Research Foundation (DFG) and the Open Access Publication Fund of Humboldt-Universität zu Berlin.  ... 
doi:10.1038/s41746-020-0227-5 pmid:32140568 pmcid:PMC7048784 fatcat:j5eop25lkbhn3kzi6cwbbeood4

Human Emotion Recognition: Review of Sensors and Methods

Dzedzickis, Kaklauskas, Bucinskas
2020 Sensors  
Automated emotion recognition (AEE) is an important issue in various fields of activities which use human emotional reactions as a signal for marketing, technical equipment, or human–robot interaction.  ...  Based on the analyzed human emotion recognition sensors and methods, we developed some practical applications for humanizing the Internet of Things (IoT) and affective computing systems.  ...  A new approach to enhance driving safety via multi-media technologies by recognizing and adapting to drivers' emotions (neutrality, panic/fear, frustration/anger, boredom/sleepiness) with multi-modal intelligent  ... 
doi:10.3390/s20030592 pmid:31973140 pmcid:PMC7037130 fatcat:tkv5jgffarg3xntjmslnzdt3tm

Hand-Gesture Recognition Based on EMG and Event-Based Camera Sensor Fusion: A Benchmark in Neuromorphic Computing

Enea Ceolini, Charlotte Frenkel, Sumit Bam Shrestha, Gemma Taverni, Lyes Khacef, Melika Payvand, Elisa Donati
2020 Frontiers in Neuroscience  
In this paper, we present a fully neuromorphic sensor fusion approach for hand-gesture recognition comprised of an event-based vision sensor and three different neuromorphic processors.  ...  Hand gestures are a form of non-verbal communication used by individuals in conjunction with speech to communicate.  ...  Finally, we thank Garrick Orchard for supporting us with the use of the Loihi platform and the useful comments to the paper.  ... 
doi:10.3389/fnins.2020.00637 pmid:32903824 pmcid:PMC7438887 fatcat:jbelvtolezbw7kamovfl6pxiam

Multiple Sensors Based Hand Motion Recognition Using Adaptive Directed Acyclic Graph

Yaxu Xue, Zhaojie Ju, Kui Xiang, Jing Chen, Honghai Liu
2017 Applied Sciences  
Recognising different grasp and manipulation tasks based on the combined signals is investigated by using an adaptive directed acyclic graph algorithm, and results of comparative experiments show the proposed  ...  The use of human hand motions as an effective way to interact with computers/robots, robot manipulation learning and prosthetic hand control is being researched in-depth.  ...  In this paper, we use an Adaptive DAG (ADAG) [25] .  ... 
doi:10.3390/app7040358 fatcat:rnxndd4qpjcexdi2xr4bsoypli

Physiological-Based Emotion Detection and Recognition in a Video Game Context

Wenlu Yang, Maria Rifqi, Christophe Marsala, Andrea Pinna
2018 2018 International Joint Conference on Neural Networks (IJCNN)  
We then investigate physiological-based emotion detection and recognition by using machine learning techniques.  ...  Most physiological-based affective gaming applications evaluate player's emotion on an overall game fragment. These approaches fail to capture the emotion change in the dynamic game context.  ...  Furthermore, the authors gratefully thank Victor Billot for aiding carrying out the experiment and also members of INSEAD-Sorbonne Universités Behavioural Lab: Liselott Pettersson, Jean-Yves Mariette,  ... 
doi:10.1109/ijcnn.2018.8489125 dblp:conf/ijcnn/YangRMP18 fatcat:augvon24yrghleaiv45t2puhry

Automatic Emotion Recognition in Children with Autism: A Systematic Literature Review

Agnieszka Landowska, Aleksandra Karpus, Teresa Zawadzka, Ben Robins, Duygun Erol Barkana, Hatice Kose, Tatjana Zorcec, Nicholas Cummins
2022 Sensors  
The paper aims at the exploration of methods and tools used to recognize emotions in children.  ...  Diverse observation channels and modalities are used in the analyzed studies, including facial expressions, prosody of speech, and physiological signals.  ...  The obstacles were classified into activity-based, child condition-based, and setup-based ones, which is an important distinction.  ... 
doi:10.3390/s22041649 pmid:35214551 pmcid:PMC8875834 fatcat:lr4zoons3zhf7anl7o4ypze2ga

Emotion Recognition from Multiple Modalities: Fundamentals and Methodologies [article]

Sicheng Zhao, Guoli Jia, Jufeng Yang, Guiguang Ding, Kurt Keutzer
2021 arXiv   pre-print
adaptation for MER.  ...  Furthermore, we present some representative approaches on representation learning of each affective modality, feature fusion of different affective modalities, classifier optimization for MER, and domain  ...  Eyes movement signals can be easily collected via an eye tracker system, and have been widely used in human-computer interaction research.  ... 
arXiv:2108.10152v1 fatcat:hwnq7hoiqba3pdf6aakcxjq33i

Intensive virtual reality and robotic based upper limb training compared to usual care, and associated cortical reorganization, in the acute and early sub-acute periods post-stroke: a feasibility study

Jigna Patel, Gerard Fluet, Qinyin Qiu, Mathew Yarossi, Alma Merians, Eugene Tunik, Sergei Adamovich
2019 Journal of NeuroEngineering and Rehabilitation  
Specifically, the study investigated whether an additional 8 h of specialized, intensive (200-300 separate hand or arm movements per hour) virtual reality (VR)/robotic based upper limb training introduced  ...  that of a control group.  ...  Supriya Massood DO, and the clerical, nursing, and rehabilitation staff of the Acute Rehabilitation Department at St.  ... 
doi:10.1186/s12984-019-0563-3 pmid:31315612 pmcid:PMC6637633 fatcat:7shbtnatgfhstkidd6zc4jwsmi

Closed-Loop Brain–Machine–Body Interfaces for Noninvasive Rehabilitation of Movement Disorders

Frédéric D. Broccard, Tim Mullen, Yu Mike Chi, David Peterson, John R. Iversen, Mike Arnold, Kenneth Kreutz-Delgado, Tzyy-Ping Jung, Scott Makeig, Howard Poizner, Terrence Sejnowski, Gert Cauwenberghs
2014 Annals of Biomedical Engineering  
We then present a novel, transformative, noninvasive closed-loop framework based on force neurofeedback and discuss several future developments of closed-loop systems that might bring us closer to individualized  ...  Recently, the more invasive method of deep brain stimulation (DBS) showed significant improvement of the physical symptoms associated with these disorders.  ...  This model-based approach (''From Spikes to Behavior'' Section) takes inspiration from the coupling of the BMI user-the PD patient-with an intelligent controller via reinforcement learning 34, 38, 97,  ... 
doi:10.1007/s10439-014-1032-6 pmid:24833254 pmcid:PMC4099421 fatcat:4jbihnphwbelhhamj7a7svrnmm

VR-PEER: A Personalized Exer-Game Platform Based on Emotion Recognition

Yousra Izountar, Samir Benbelkacem, Samir Otmane, Abdallah Khababa, Mostefa Masmoudi, Nadia Zenati
2022 Electronics  
This paper proposed a VR-PEER adaptive exer-game system based on emotion recognition.  ...  An experimental study has been conducted on fifteen subjects, who confirmed the usefulness of the proposed system in the motor rehabilitation process.  ...  The system developed used CNN for hand gesture recognition via EMG signals.  ... 
doi:10.3390/electronics11030455 fatcat:p6fmhu6pwzcujhnhgnu3mso4pa