Embodied Collaborative Referring Expression Generation in Situated Human-Robot Interaction
2015
Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction - HRI '15
To facilitate referential communication between humans and robots and mediate their differences in representing the shared environment, we are exploring embodied collaborative models for referring expression ...
Instead of a single minimum description to describe a target object, episodes of expressions are generated based on human feedback during human-robot interaction. ...
ACKNOWLEDGMENTS This work was supported by IIS-1208390 from the National Science Foundation and N00014-11-1-0410 from the Office of Naval Research. ...
doi:10.1145/2696454.2696467
dblp:conf/hri/FangDC15
fatcat:nnt6oramfvetffrwvm74f3oodi
Planning with Verbal Communication for Human-Robot Collaboration
[article]
2017
arXiv
pre-print
Human collaborators coordinate their actions effectively through both verbal and non-verbal communication. We believe that the same should hold for human-robot teams. ...
We propose a formalism that enables a robot to decide optimally between doing a task and issuing an utterance. ...
Figure 2: Human adaptation model that accounts for verbal commands. ...
arXiv:1706.04694v1
fatcat:li6bld222fapblvpnlketptdle
The Influence of Robot Verbal Support on Human Team Members: Encouraging Outgroup Contributions and Suppressing Ingroup Supportive Behavior
2020
Frontiers in Psychology
... a human-robot team comprised of 2 human ingroup members, 1 human outgroup member, and 1 robot. ...
In this work, we investigated the effects of verbal support from a robot (e.g., "good idea Salim," "yeah") on human team members' interactions related to psychological safety and inclusion. ...
We also thank Tom Wallenstein and Sean Hackett for their assistance in data collection and in the development of the backchanneling annotation coding scheme. ...
doi:10.3389/fpsyg.2020.590181
pmid:33424708
pmcid:PMC7793683
fatcat:np6hy7apavepzgs2ymgdbm2oue
Can a child feel responsible for another in the presence of a robot in a collaborative learning activity?
2015
2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
In order to explore the impact of integrating a robot as a facilitator in a collaborative activity, we examined interpersonal distancing of children both with a human adult and a robot facilitator. ...
Our study involved 40 children between 6 and 8 years old, in two conditions (robot or human facilitator). ...
Without the collaboration of the school principal, all involved teachers, children and parents, this study would not have been possible. ...
doi:10.1109/roman.2015.7333678
dblp:conf/ro-man/ChandraALSPD15
fatcat:wgtjmwillzfcdmkacpdqi3yemy
Human-Robot Collaboration: A Literature Review and Augmented Reality Approach in Design
2008
International Journal of Advanced Robotic Systems
The visual channel should allow the robot to recognize and interpret human non-verbal communication cues and allow the robot to express some non-verbal cues that a human can naturally understand. ...
... and proposes a holistic architectural design for human-robot collaboration. ...
Acknowledgements We would like to acknowledge the collaboration of Randy Stiles and Scott Richardson at the Lockheed Martin Space Systems Company, Sunnyvale California, USA. ...
doi:10.5772/5664
fatcat:pmkrifjifve27a2pdvz36rocwy
Sociable Robots
2006
Journal of the Robotics Society of Japan
With respect to human-robot teamwork, collaborative discourse theory has provided a rich theoretical framework to understand and model human-style collaboration for human-robot ...
We have demonstrated our robot's ability to apply its mindreading skills to compare and reason about how its human partner's goal and belief states (as communicated through verbal and non-verbal behavior ...
doi:10.7210/jrsj.24.591
fatcat:nn7h5nvth5go5ls5vrlsbhffni
How Robot Verbal Feedback Can Improve Team Performance in Human-Robot Task Collaborations
2015
Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction - HRI '15
We detail an approach to planning effective verbal feedback during pairwise human-robot task collaboration. ...
A user study was conducted to experimentally validate the efficacy of the approach on a task in which a single user collaborates with an autonomous robot. ...
... proof of concept designed to evaluate the approach in a co-located human-robot task collaboration as compared to a non-communicating robot. ...
doi:10.1145/2696454.2696491
dblp:conf/hri/ClairM15
fatcat:pbm3fbz7qbh4zfptkxa3bv6ft4
Effects of nonverbal communication on efficiency and robustness in human-robot teamwork
2005
2005 IEEE/RSJ International Conference on Intelligent Robots and Systems
In this paper, we explore the impact of non-verbal social cues and behavior on task performance by a human-robot team. ...
We report our results from an experiment where naïve human subjects guide a robot to perform a physical task using speech and gesture. ...
Our own work in mixed-initiative human-robot teamwork grounds these theoretical ideas for the case where a human and a humanoid robot work collaboratively to perform a physical task in a shared workspace ...
doi:10.1109/iros.2005.1545011
dblp:conf/iros/BreazealKTHB05
fatcat:pdtqymnzabfgxovyrs5idlznbe
Human-robot collaborative tutoring using multiparty multimodal spoken dialogue
2014
Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction - HRI '14
A human-robot interaction setup is designed, and a human-human dialogue corpus is collected. ...
These are used to build a situated model of the interaction based on the participants' personalities, their state of attention, their conversational engagement and verbal dominance, and how that is correlated ...
Furhat was developed to support non-verbally and dynamically rich audio-visual synthesis, and to study human-robot spoken interactions [3, 4], together with the newly developed IrisTK dialogue platform ...
doi:10.1145/2559636.2563681
dblp:conf/hri/MoubayedBBGHJKLNOSSV14
fatcat:sqpyhr4il5h35bmiz6tnfeq5ue
From human action understanding to robot action execution: how the physical properties of handled objects modulate non-verbal cues
2020
2020 Joint IEEE 10th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)
Humans manage to communicate action intentions in a non-verbal way, through body posture and movement. ...
We close the loop from action understanding to robot action execution with an adaptive and robust controller based on the learned classifier, and evaluate the entire pipeline on a collaborative task with ...
Learning a robot dynamics model from human motion improves human-robot collaboration, making it more predictable. ...
doi:10.1109/icdl-epirob48136.2020.9278084
fatcat:6zdiuag5yjcpxj46ixwa3nawky
Interactive, Collaborative Robots: Challenges and Opportunities
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
In the envisioned future factory setups, home and office environments, humans and robots will share the same workspace and perform different object manipulation tasks in a collaborative manner. ...
Industrial robots today are still largely preprogrammed for their tasks, not able to detect errors in their own performance or to robustly interact with a complex environment and a human worker. ...
Acknowledgements This work was supported by the Swedish Foundation for Strategic Research (SSF) and Knut and Alice Wallenberg Foundation through the WASP project. ...
doi:10.24963/ijcai.2018/3
dblp:conf/ijcai/KragicGKJ018
fatcat:xhg22csfwbbifausf3o5xsoh24
Studying verbal feedback in human collaborations to inform robot speech production
2014
2014 International Conference on Collaboration Technologies and Systems (CTS)
This short paper motivates the use of robot verbal feedback in human-robot task collaboration scenarios and presents results from a pilot study aimed at identifying how people use speech to coordinate ...
From these results, three types of verbal feedback are identified as well as requirements for a robot to correctly employ these speech patterns while collaborating with a person. ...
ACKNOWLEDGMENT This work was conducted at the Interaction Lab, part of the Robotics and Autonomous Systems Center (RASC) at the University of Southern California and supported in part by National Science ...
doi:10.1109/cts.2014.6867555
dblp:conf/cts/ClairM14
fatcat:2ejkyzgypnhr3e4j5m5biijoba
The Implications of Interactional "Repair" for Human-Robot Interaction Design
2011
2011 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology
This paper recaps a recent study in the organization of interactive practices utilized by humans in card-game activities for the purposes of informing the design of human-robot interaction with autonomous ...
We conclude this paper with a brief discussion of the technical implications for future design of autonomous social robots. Keywords: social robots; social interaction; conversation analysis; video analysis; repair ...
We also wish to thank our colleague, Margaret Szymanski, for her invaluable help in preparing this manuscript for publication, and her continued collaboration on socio-technical research projects. ...
doi:10.1109/wi-iat.2011.213
dblp:conf/iat/PlurkowskiCV11
fatcat:uocrtdeeufhutkmalzh4k6axum
Issues in the Development of Conversation Dialog for Humanoid Nursing Partner Robots in Long-Term Care
[chapter]
2021
Information Systems - Intelligent Information Processing Systems [Working Title]
Furthermore, it is critical to develop a friendlier type of robot by equipping it with non-verbal emotive expressions that older people can perceive. ...
As for its hardware, allowing an independent range of action and degrees of freedom reduces the burden exerted in human-robot communication, thereby unburdening nurses and professional caregivers ...
... Savina Schoenhofer for reviewing this chapter prior to its publication. ...
doi:10.5772/intechopen.99062
fatcat:nxtixv4g6jaijlwwyahwitfivq
Children's reliance on the non-verbal cues of a robot versus a human
2019
PLoS ONE
Robots are used for language tutoring increasingly often, and commonly programmed to display non-verbal communicative cues such as eye gaze and pointing during robot-child interactions. ...
Here, we assessed whether four- to six-year-old children (i) differed in their weighing of non-verbal cues (pointing, eye gaze) and verbal cues provided by a robot versus a human; (ii) weighed non-verbal ...
We would like to thank Annelies Boeve, Loes Hermelink, Bente Homan, and Michelle Zeelenberg for collecting the data, and Esmee Kramer for coding the data of the disambiguation task. Leseman. ...
doi:10.1371/journal.pone.0217833
pmid:31856239
pmcid:PMC6922398
fatcat:ptilrqjoxreq3jpmnsmdamvllu