See Where I am Looking at

Eunice Mwangi, Emilia Barakova, Ruixin Zhang, Marta Diaz, Andreu Catala, Matthias Rauterberg
2016 Proceedings of the Fourth International Conference on Human Agent Interaction - HAI '16  
In this study, we examine how people perceive gaze cues and head angles directed towards different target positions on a table when a human and a NAO robot sit opposite each other, as in board-game scenarios  ...  While there has been considerable progress in the design of social gaze cues for robots, little has been done to examine the ability of humans to read and accept help signals from a robot's  ...  Because NAO lacks movable eyes and therefore has to turn its entire head to look at something, it is necessary to establish how people perceive gaze cues while interacting with the robot.  ... 
doi:10.1145/2974804.2980479 fatcat:voai4ig46vh6hgcgorzg2dg7q4

Can Children Take Advantage of Nao Gaze-Based Hints During GamePlay?

Eunice Mwangi, Marta Diaz, Emilia Barakova, Andreu Catala, Matthias Rauterberg
2017 Proceedings of the 5th International Conference on Human Agent Interaction - HAI '17  
In one session, the robot gave hints to help the child find matching cards by looking at the correct match and, in the other session, the robot only looked at the child and did not give them any help.  ...  This paper presents a study that analyzes the effects of robots' gaze hints on children's performance in a card-matching game.  ...  time looking at the robot.  ... 
doi:10.1145/3125739.3132613 dblp:conf/hai/MwangiDBCR17 fatcat:uaqepgup7vajdg76p3gzv2wh44

On the Imitation of Goal Directed Movements of a Humanoid Robot

Yunqing Bao, Raymond H. Cuijpers
2017 International Journal of Social Robotics  
Four different gaze timings at which the robot looked back at the participant with respect to its hand MT were implemented: (a) 0 MT: the robot always gazed at the participant, and it never gazed to its  ...  We expected that the earlier the robot gazed at participants, the shorter the reaction time would be, provided that the gaze shift was perceived as a cue to take turns.  ... 
doi:10.1007/s12369-017-0417-8 fatcat:4rttlkq43zh7xmpx5wbuv7gqgu

Perception of humanoid social mediator in two-person dialogs

Yasir Tahir, Umer Rasheed, Shoko Dauwels, Justin Dauwels
2014 Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction - HRI '14  
In this study, participants are asked to take part in two-person conversations with the Nao robot as mediator.  ...  This paper presents a humanoid robot (Nao) that provides real-time sociofeedback to participants taking part in two-person dialogs.  ...  Speak louder: "I am sorry, I cannot hear you". When one or both of the speakers are speaking too softly, Nao will ask them to increase their volume. Too noisy: "Please lower your volume".  ... 
doi:10.1145/2559636.2559831 dblp:conf/hri/TahirRDD14 fatcat:mk3lttttmvcuzlnmobyst4byri

Nonverbal Immediacy as a Characterisation of Social Behaviour for Human–Robot Interaction

James Kennedy, Paul Baxter, Tony Belpaeme
2016 International Journal of Social Robotics  
The resulting behaviour is evaluated in a more general context, where both children and adults judge the immediacy of humans and robots in a similar manner, and their recall of a short story is tested.  ...  A literature review is conducted to explore the impact on learning of the social cues which form the nonverbal immediacy measure.  ...  I am programmed to collect all that is not wanted, and at night I send it to places other humans can use it. I am a maximum efficiency machine. Did you not know?". Ricky started feeling ashamed.  ... 
doi:10.1007/s12369-016-0378-3 fatcat:yiyrwelnufc6vkoocdxwop3exq

Quantifying patterns of joint attention during human-robot interactions: An application for autism spectrum disorder assessment

Salvatore Maria Anzalone, Jean Xavier, Sofiane Boucenna, Lucia Billeci, Antonio Narzisi, Filippo Muratori, David Cohen, Mohamed Chetouani
2018 Pattern Recognition Letters  
In this paper we explore the dynamics of Joint Attention (JA) in children with Autism Spectrum Disorder (ASD) during an interaction task with a small humanoid robot.  ...  Employing the same metrics, we also assess a subgroup of 14 children with ASD after six months of JA training with a serious game.  ...  The authors would like to thank A. Arrigo, R. Grassia and G. Varni for their kind support and collaboration.  ... 
doi:10.1016/j.patrec.2018.03.007 fatcat:ggm6as4drrg2tcjhqn4x5hrvia

Evaluation of Head Gaze Loosely Synchronized With Real-Time Synthetic Speech for Social Robots

Vasant Srinivasan, Cindy L. Bethel, Robin R. Murphy
2014 IEEE Transactions on Human-Machine Systems  
This research demonstrates that robots can achieve socially acceptable interactions using loosely synchronized head gaze-speech, without understanding the semantics of the dialog.  ...  requiring the operator to act as a puppeteer.  ...  [Pause] I am a robot that has been sent to help you. The building you were in collapsed. Please do not worry; a rescue team is aware that you are trapped and knows where you are.  ... 
doi:10.1109/thms.2014.2342035 fatcat:u3owv47mdzggvaufd6cyh7nrm4

10 Years of Human-NAO Interaction Research: A Scoping Review

Aida Amirova, Nazerke Rakhymbayeva, Elmira Yadollahi, Anara Sandygulova, Wafa Johal
2021 Frontiers in Robotics and AI  
The evolving field of human-robot interaction (HRI) necessitates that we better understand how social robots operate and interact with humans.  ...  We analyzed a wide range of theoretical, empirical, and technical contributions that provide multidimensional insights, such as general trends in terms of application, the robot capabilities, its input  ...  In a play scenario, NAO can also show excitement and enjoyment using matching phrases such as "I am really excited!," "I enjoy playing with you!"  ... 
doi:10.3389/frobt.2021.744526 pmid:34869613 pmcid:PMC8640132 fatcat:aejrcoz4rjfcncycjpatjnyufq

Adapt, Explain, Engage—A Study on How Social Robots Can Scaffold Second-language Learning of Children

Thorsten Schodde, Laura Hoffmann, Sonja Stange, Stefan Kopp
2019 ACM Transactions on Human-Robot Interaction (THRI)  
We ask if and how a social robot can be utilized to scaffold second-language learning of children at kindergarten age (4-7 years).  ...  These findings demonstrate that a social robot equipped with suitable scaffolding mechanisms can increase engagement and learning, especially when being adaptive to the individual behavior and states of  ...  I am curious to see whether you find it."  ... 
doi:10.1145/3366422 fatcat:d4yfsjj3ifgvdaedosmkut4f2q

Eight Lessons Learned about Non-verbal Interactions through Robot Theater [chapter]

Heather Knight
2011 Lecture Notes in Computer Science  
Robotics has had a long history with the field of entertainment; even the word 'robot' comes from the 1921 Czech play 'R.U.R.'  ...  Robot Theater is a fairly new arena for researching Human-Robot Interaction; however, in surveying research already conducted, we have identified eight lessons from Robot Theater that inform the design  ...  What level is everyone's excitement currently at? I'm sorry. I cannot hear you. Would you please repeat your excitement, preferably at a louder volume? Thank you. I am also excited.  ... 
doi:10.1007/978-3-642-25504-5_5 fatcat:wcurkikcsnbrdddzpmqhrg7lwi

Robot in the mirror: toward an embodied computational model of mirror self-recognition [article]

Matej Hoffmann, Shengzhi Wang, Vojtech Outrata, Elisabet Alzueta, Pablo Lanillos
2020 arXiv   pre-print
Second, we develop a model to enable the humanoid robot Nao to pass the test.  ...  The architecture is tested on two robots with completely different faces.  ...  gaze / eye tracking: where do the subjects look: (i) at the target (mark on the face) in the mirror, (ii) alternating between the target and their hand in the mirror, (iii) at their hand directly; elements of tactile  ... 
arXiv:2011.04485v1 fatcat:ousnbkjinjcl3lwxktqw6vl6mm

Dances with Robots

Catie Cuan
2021 TDR: The Drama Review  
What does it feel like to dance with a robot? How do you choreograph one?  ...  Working with robots during three artistic residencies and two research projects has raised questions about agency and generative processes, revealing how dancing with robots may provoke a more interanimate  ...  As I gaze long into another room rendered on one of them, I see my torso reflected in the screen glass, staring back at me.  ... 
doi:10.1017/s105420432000012x fatcat:yfp4jsemkzdmfeqjkt5mo3dpvq

Feasibility Study on the Role of Personality, Emotion, and Engagement in Socially Assistive Robotics: A Cognitive Assessment Scenario

Alessandra Sorrentino, Gianmaria Mancioppi, Luigi Coviello, Filippo Cavallo, Laura Fiorini
2021 Informatics  
To this end, we organized an experimental session with 11 elderly users who performed a cognitive assessment with the non-humanoid ASTRO robot.  ...  On the contrary, the cognitive assessment with the robot significantly reduced the anxiety of the users, by enhancing trust in the robotic entity.  ...  Acknowledgments: The authors would like to thank the director, the medical staff and the residents of the residential facility where the experimentation took place for their availability.  ... 
doi:10.3390/informatics8020023 fatcat:5dyyjsxjt5bi7lxzimwfn4in3y

Theory of Robot Communication: I. The Medium is the Communication Partner [article]

Johan F. Hoorn
2018 arXiv   pre-print
In view of the rise of social robots in coming years, I define the theoretical precepts of a possible next step in CMC, which I elaborate in a second paper.  ...  As a result, I argue for an addition to CMC theorizing when the robot as a medium itself becomes the communication partner.  ...  Acknowledgments I am thankful to Elly A. Konijn for commenting on an earlier draft of this paper.  ... 
arXiv:1812.04408v1 fatcat:j6inlrj4cffyxiiih34qtjeuvy

Multimodal data collection of human-robot humorous interactions in the Joker project

Laurence Devillers, Sophie Rosset, Guillaume Dubuisson Duplessis, Mohamed A. Sehili, Lucile Bechade, Agnes Delaborde, Clement Gossart, Vincent Letard, Fan Yang, Yucel Yemez, Bekir B. Turker, Metin Sezgin (+6 others)
2015 2015 International Conference on Affective Computing and Intelligent Interaction (ACII)  
This paper presents a data collection of social interaction dialogs involving humor between a human participant and a robot.  ...  One of the major contributions of this work is to provide a context to study human laughter produced during a human-robot interaction.  ...  In the end, the system closes the interaction by drawing a conclusion about the perceived reactions from the human (e.g., "I am very glad you like humor produced by a robot.").  ... 
doi:10.1109/acii.2015.7344594 dblp:conf/acii/DevillersRDSBDG15 fatcat:jh6zcr2r6rhc7kzgelgrfsreiu