
Effects of nonverbal communication on efficiency and robustness in human-robot teamwork

C. Breazeal, C.D. Kidd, A.L. Thomaz, G. Hoffman, M. Berlin
2005 IEEE/RSJ International Conference on Intelligent Robots and Systems  
effectiveness of human-robot teamwork where the robot serves as a cooperative partner.  ...  We report our results from an experiment where naïve human subjects guide a robot to perform a physical task using speech and gesture.  ...  Naïve human subjects were asked to instruct an autonomous humanoid robot using speech and gesture to perform a simple physical task. The robot does not speak.  ... 
doi:10.1109/iros.2005.1545011 dblp:conf/iros/BreazealKTHB05 fatcat:pdtqymnzabfgxovyrs5idlznbe

Interactive Robot Learning of Gestures, Language and Affordances [article]

Giovanni Saponaro, Lorenzo Jamone, Alexandre Bernardino, Giampiero Salvi
2017 arXiv   pre-print
A growing field in robotics and Artificial Intelligence (AI) research is human-robot collaboration, whose target is to enable effective teamwork between humans and robots.  ...  We propose a model that unites (i) learning robot affordances and word descriptions with (ii) statistical recognition of human gestures with vision sensors.  ...  We thank Konstantinos Theofilis for his software and help permitting the acquisition of human hand coordinates in human-robot interaction scenarios with the iCub robot.  ... 
arXiv:1711.09055v1 fatcat:2akajohvyfhtrdylwe6wgr75fe

Communicating with Teams of Cooperative Robots [chapter]

D. Perzanowski, A. C. Schultz, W. Adams, M. Bugajska, E. Marsh, J. G. Trafton, D. Brock, M. Skubic, M. Abramson
2002 Multi-Robot Systems: From Swarms to Intelligent Automata  
An integrated context and dialog processing component that incorporates knowledge of spatial relations enables cooperative activity between the multiple agents, both human and robotic.  ...  We are designing and implementing a multi-modal interface to a team of dynamically autonomous robots. For this interface, we have elected to use natural language and gesture.  ...  ACKNOWLEDGMENTS The Naval Research Laboratory and the Office of Naval Research partly funded this research.  ... 
doi:10.1007/978-94-017-2376-3_20 fatcat:zfvaydxitvfr7lcis3mw7xnw4m

HRI Workshop on Human-Robot Teaming

Bradley Hayes, Ivana Kruijff-Korbayova, Maarten Sierhuis, Julie A. Shah, Brian Scassellati, Matthew C. Gombolay, Malte F. Jung, Koen Hindriks, Joachim de Greeff, Catholijn Jonker, Mark Neerincx, Jeffrey M. Bradshaw (+1 others)
2015 Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts - HRI'15 Extended Abstracts  
Developing collaborative robots that can productively and safely operate out of isolation in uninstrumented, human-populated environments is an important goal for the field of robotics.  ...  when possible, and provide instruction or guidance when necessary is a long-term goal of Human-Robot Interaction research.  ...  Keywords: Robotics; Collaboration; Teamwork; HRI; HART
doi:10.1145/2701973.2714396 dblp:conf/hri/HayesGJHGJNBJKS15 fatcat:ewrlxuxy55bkrevgtk2ypangxy

Adjustable Autonomy and Human-Agent Teamwork in Practice: An Interim Report on Space Applications [chapter]

Jeffrey M. Bradshaw, Maarten Sierhuis, Alessandro Acquisti, Paul Feltovich, Robert Hoffman, Renia Jeffers, Debbie Prescott, Niranjan Suri, Andrzej Uszok, Ron Van Hoof
2003 Multiagent Systems, Artificial Societies, and Simulated Organizations  
, as well as those who are interested in design and execution tools for teams of robots that can function as effective assistants to humans.  ...  We then summarize the interim results of our study on the problem of work practice modeling and human-agent collaboration in space applications, the development of a broad model of human-agent teamwork  ...  Under NASA sponsorship, we are investigating issues in human-robotic teamwork and adjustable autonomy.  ... 
doi:10.1007/978-1-4419-9198-0_11 fatcat:edocqjsbzzfhxpbecsvtmzzdjy

Computational Human-Robot Interaction

Andrea Thomaz, Guy Hoffman, Maya Cakmak
2016 Foundations and Trends in Robotics  
Nonverbal Behavior • Affect and Emotion • Recognizing Humans and Human Poses • Face and Person Recognition • Gesture and Activity Recognition • Pointing and Hand Gestures • Detecting Engagement  ...  Activity recognition for natural human robot interaction. In Social Robotics, pages 84-94. Springer, 2014. V. Chu, K. Bullard, and A. L. Thomaz.  ... 
doi:10.1561/2300000049 fatcat:qo32i5ts7vbxlj2oo2hywt376y

Teamwork in Animals, Robots, and Humans [chapter]

Carl Anderson, Nigel R. Franks
2003 Advances in the Study of Behavior  
1 School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0205, USA; 2 School of Biological Sciences, University of Bristol, Bristol, BS8 1UG, UK  ...  We also thank Alain Dejean, Turid Hölldobler-Forsyth, and Matt Quinn for permission to publish the drawings and photographs that appear in our figures.  ...  Acknowledgments We thank Tucker Balch, Peter Godwin, Stephen Harris, Peter Neumann, Scott Powell, Matt Quinn, Tim Roper, Peter Saul, Peter Slater, and an anonymous referee for their help and suggestions  ... 
doi:10.1016/s0065-3454(03)33001-3 fatcat:2sy57gqmu5cwjkcx4midgzclea

Comparative Study on the Educational Use of Home Robots for Children

Jeong-Hye Han, Mi-Heon Jo, Vicki Jones, Jun-H. Jo
2008 Journal of Information Processing Systems  
This study compared the effects of non-computer based (NCB) media (using a book with audiotape) and Web-Based Instruction (WBI), with the effects of Home Robot-Assisted Learning (HRL) for children.  ...  Human-Robot Interaction (HRI), based on already well-researched Human-Computer Interaction (HCI), has been under vigorous scrutiny since recent developments in robot technology.  ...  A comparison was also made between the effect of using home robots and other instructional media such as NCB and WBI.  ... 
doi:10.3745/jips.2008.4.4.159 fatcat:7jm4n3zm5vgu7k5z7dwjzknq4a

Joint action perception to enable fluent human-robot teamwork

Tariq Iqbal, Michael J. Gonzales, Laurel D. Riek
2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)  
In this paper, we describe an event-based model for multiple robots to automatically measure synchronous joint action of a group while both the robots and co-present humans are moving.  ...  This is a challenging perceptual task when both the robots and people are in motion.  ...  While this work is useful for some perceptual situations, from a human-robot teamwork perspective it may not tell robots much information about the context of how humans are interacting within the environment  ... 
doi:10.1109/roman.2015.7333671 dblp:conf/ro-man/IqbalGR15 fatcat:ng4rexnjdbdq3gv6ynhweln2v4

Effects of anticipatory perceptual simulation on practiced human-robot tasks

Guy Hoffman, Cynthia Breazeal
2009 Autonomous Robots  
We also show the robot and the human to improve their relative contribution at a similar rate, possibly playing a part in the human's "like-me" perception of the robot.  ...  We also find differences in verbal attitudes towards the robot: most notably, subjects working with the anticipatory robot attribute more human qualities to the robot, such as gender and intelligence,  ...  This supports our hypothesis that top-down anticipatory perceptual simulation can aid in fluent human-robot teamwork in which a human and a robot jointly practice a task.  ... 
doi:10.1007/s10514-009-9166-3 fatcat:v6lapajtsfc7hkqbt4gdsithuq

A survey of robot learning from demonstrations for Human-Robot Collaboration [article]

Jangwon Lee
2017 arXiv   pre-print
Since there are different aspects between stand-alone tasks and collaborative tasks, researchers should consider these differences to design collaborative robots for more effective and natural human-robot  ...  In this regard, many researchers have shown an increased interest in making a better communication framework between robots and humans, because communication is a key issue in applying the LfD paradigm for human-robot  ...  Gesture recognition is also widely used for human-robot collaboration since gestures can be one of the effective communication channels between humans and robots for working together [23].  ... 
arXiv:1710.08789v1 fatcat:bch3zc6tkfh4vawut36qbmgxde

Designing for Young Children: Learning, Practice, and Decisions

Yanghee Kim, Diantha Smith
2018 International Journal of Designs for Learning  
The design challenges and the way we address them may be useful for others developing similar interventions for young children.  ...  We present a few snapshots from the design case to illustrate the teamwork and design enhancement.  ...  In individual use, a child had time to build confidence with a friend-like robot; in small-group use, the robot served as a center for collaborative work among human peers.  ... 
doi:10.14434/ijdl.v9i1.23099 fatcat:p5kyzxvxgfgzdgeyh76giqy5ei

Human-Inspired Robots

S. Coradeschi, H. Ishiguro, M. Asada, S.C. Shapiro, M. Thielscher, C. Breazeal, M.J. Mataric, H. Ishida
2006 IEEE Intelligent Systems  
Bridging science and engineering: One way to tackle the issue is to use a humanlike robot, an android, to study human-robot interaction.  ...  The study of human-robot interaction has been neglecting an issue: appearance and behavior.  ...  By making its learning process transparent to the human, the robot actively improves its own learning environment by helping the human better tune their instruction for the robot.  ... 
doi:10.1109/mis.2006.72 fatcat:6gx6herhz5f3xfcyedfrscbaxy

Detecting and Synthesizing Synchronous Joint Action in Human-Robot Teams

Tariq Iqbal, Laurel D. Riek
2015 Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI '15  
To become capable teammates to people, robots need the ability to interpret human activities and appropriately adjust their actions in real time.  ...  The goal of our research is to build robots that can work fluently and contingently with human teams.  ...  The employment of this method to a human-robot teamwork scenario, where both the humans and the robots were in motion [9].  ... 
doi:10.1145/2818346.2823315 dblp:conf/icmi/IqbalR15 fatcat:2fhffqsd7nb3hcpnficz5eleai

Towards High-Level Human Activity Recognition through Computer Vision and Temporal Logic [chapter]

Joris Ijsselmuiden, Rainer Stiefelhagen
2010 Lecture Notes in Computer Science  
Our laboratory for investigating new ways of human-machine interaction and teamwork support is equipped with an assemblage of cameras, some close-talking microphones, and a videowall as main interaction  ...  We also monitor the users' speech activity in real time. This paper explains our approach to high-level activity recognition based on these perceptual components and a temporal logic engine.  ...  Besides applications in human-machine interaction, we use computer vision, speech recognition, and high-level activity recognition to automatically generate reports and visualizations.  ... 
doi:10.1007/978-3-642-16111-7_49 fatcat:ko77ydxvnffvnijoqujmknst3i
Showing results 1 — 15 out of 2,003 results