Dialogue with Robots to Support Symbiotic Autonomy
[chapter]
2016
Lecture Notes in Electrical Engineering
Given the recent advancements in spoken language recognition, dialog in natural language will be a major component of robotic interfaces, even though it will certainly be coupled with a number of multi-modal ...
We have addressed Spoken Command Interpretation by implementing a cascade of reusable NLP modules that can be adapted to changing operational scenarios, through trainable statistical models for which ...
doi:10.1007/978-981-10-2585-3_27
fatcat:qftghjaulncubg3vwmfta3dmlu
Learning environmental knowledge from task-based human-robot dialog
2013
2013 IEEE International Conference on Robotics and Automation
This paper presents an approach for learning environmental knowledge from task-based human-robot dialog. ...
Previous approaches to dialog use domain knowledge to constrain the types of language people are likely to use. ...
We would also like to acknowledge Robin Soetens for his contributions to later versions of this system. ...
doi:10.1109/icra.2013.6631186
dblp:conf/icra/KollarPNV13
fatcat:gs3vhz2wfbhmzkm6suwcjmfx2e
The RobotSlang Benchmark: Dialog-guided Robot Localization and Navigation
[article]
2020
arXiv
pre-print
To study such cooperative communication, we introduce Robot Simultaneous Localization and Mapping with Natural Language (RobotSlang), a benchmark of 169 natural language dialogs between a human Driver ...
Autonomous robot systems for applications from search and rescue to assistive guidance should be able to engage in natural language dialog with people. ...
Acknowledgments The authors are supported in part by ARO grant (W911NF-16-1-0121) and by the US National Science Foundation National Robotics Initiative under Grants 1522904. ...
arXiv:2010.12639v1
fatcat:k53mcuuvmrghhivcmcyx63qiea
Human-Robot Interaction Through Gesture-Free Spoken Dialogue
2004
Autonomous Robots
We present an approach to hands-free interaction with autonomous robots through spoken dialog. ...
Our approach is based on passive knowledge rarefication through goal disambiguation, a technique that allows the robot to refine and acquire knowledge through spoken dialog with a human operator. ...
Command is a minimum requirement for human-robot dialog-based interaction in that the robot must at least be able to execute direct commands from the operator. Let us go through an example. ...
doi:10.1023/b:auro.0000025789.33843.6d
fatcat:ahgwuvzfbzc5jhc6e2ag5ertr4
Augmenting Knowledge through Statistical, Goal-oriented Human-Robot Dialog
[article]
2019
arXiv
pre-print
Some robots can interact with humans using natural language, and identify service requests through human-robot dialog. ...
In this paper, we develop a dialog agent for robots that is able to interpret user commands using a semantic parser, while asking clarification questions using a probabilistic dialog manager. ...
ACKNOWLEDGEMENTS We are grateful to the BWI team at UT Austin for making their software available to the public. ...
arXiv:1907.03390v2
fatcat:h2e5uybukvbr7ca4vdq6gci3zm
Exploiting Deep Semantics and Compositionality of Natural Language for Human-Robot-Interaction
[article]
2016
arXiv
pre-print
We develop a natural language interface for human robot interaction that implements reasoning about deep semantics in natural language. ...
This also includes verbal interaction with humans to clarify commands and queries that are too ambiguous to be executed safely. ...
The work presented in [28] describes the learning of a parser to infer robot commands from natural language. ...
arXiv:1604.06721v1
fatcat:n6zfaafncrb35dhrx4vgpei7qa
On the performance evaluation of a vision-based human-robot interaction framework
2012
Proceedings of the Workshop on Performance Metrics for Intelligent Systems - PerMIS '12
Together, RoboChat and the dialog mechanism enable a human operator to send a series of complex instructions to a robot, with the assurance of confirmations in case of high task-cost or command uncertainty ...
The paper describes the details of the visual human-robot interaction framework, with an emphasis on the RoboChat language and the confirmation system, and presents a summary of the set of performance ...
The underlying language, called RoboChat, enables the user to program the robot to carry out a large variety of tasks, both simple and complex in nature. ...
doi:10.1145/2393091.2393096
dblp:conf/permis/SattarD12
fatcat:kpaa4bjkt5dfjjlox4xu5h2kgi
Interactive robot task training through dialog and demonstration
2007
Proceeding of the ACM/IEEE international conference on Human-robot interaction - HRI '07
We present a framework for interactive task training of a mobile robot where the robot learns how to do various tasks while observing a human. ...
In this paper, we describe the task training framework, describe how environmental context and communicative dialog with the human help the robot learn the task, and illustrate the utility of this approach ...
ACKNOWLEDGMENTS We would like to thank the Naval Research Labs for developing the NAUTILUS natural language processing system and for helping us understand how to use it. ...
doi:10.1145/1228716.1228724
dblp:conf/hri/RybskiYSV07
fatcat:4nzmohni3zdrrjoyapx4qkvw7m
Learning Task Knowledge from Dialog and Web Access
2015
Robotics
Author Contributions The first two authors contributed equally to this work.
Conflicts of Interest The authors declare no conflict of interest. ...
the autonomous service mobile robots, and Mike Licitra for the design and construction of the always-functional and reliable CoBot robot platform. ...
human-robot dialog systems. ...
doi:10.3390/robotics4020223
fatcat:4yzwwl2sfndazkurawikmly2v4
Spatial Language for Human–Robot Dialogs
2004
IEEE Transactions on Systems Man and Cybernetics Part C (Applications and Reviews)
The authors would also like to acknowledge the help of Scott Thomas and Myriam Abramson at NRL and Dr. Pascal Matsakis and Dr. Jim Keller at UMC. ...
doi:10.1109/tsmcc.2004.826273
fatcat:jnycgdkyira25datpmah5xla4a
Toward Human-Like Robot Learning
[chapter]
2018
Lecture Notes in Computer Science
This human-like learning can happen because the robot can extract, represent and reason over the meaning of the user's natural language utterances. ...
We present an implemented robotic system that learns elements of its semantic and episodic memory through language interaction with people. ...
The core prerequisite for human-like learning is the ability to automatically extract, represent and use the meaning of natural language texts: utterances, dialog turns, etc. ...
doi:10.1007/978-3-319-91947-8_8
fatcat:heg7okdwajdpzlqgaygx6r4ixe
Augmented Reality for Human-Robot Collaboration
[chapter]
2007
Human Robot Interaction
The result was natural human-robot spatial dialog enabling the robot to communicate obstacle locations relative to itself and receive verbal commands to move to or near an object it had detected. ...
(Breazeal, Edsinger et al. 2001) Robots with human social abilities, rich social interaction and natural communication will be able to learn from human counterparts through cooperation and tutelage. ...
For example, a significant research effort is being devoted to designing human-robot interfaces that make it easier for people to interact with robots. ...
doi:10.5772/5187
fatcat:d7e7swkbtrgblo5yw6xtgvqfem
Approaching the Symbol Grounding Problem with Probabilistic Graphical Models
2011
The AI Magazine
In order for robots to engage in dialog with human teammates, they must have the ability to map between words in the language and aspects of the external world. ...
A solution to this symbol grounding problem (Harnad, 1990) would enable a robot to interpret commands such as "Drive over to receiving and pick up the tire pallet." ...
Acknowledgments We would like to thank Dimitar Simeonov, Alejandro Perez, and Nick dePalma as well as the annotators on Amazon Mechanical Turk and the members of the Turker Nation forum. ...
doi:10.1609/aimag.v32i4.2384
fatcat:o52l2szpfvaxfkiphy5q7uv7x4
Cornell SPF: Cornell Semantic Parsing Framework
[article]
2016
arXiv
pre-print
The Cornell Semantic Parsing Framework (SPF) is a learning and inference framework for mapping natural language to formal representation of its meaning. ...
for Time Expressions (Lee et al., 2014) • Scalable Semantic Parsing with Partial Ontologies (Choi et al., 2015) • Learning to Interpret Natural Language Commands through Human-Robot Dialog (Thomason ...
arXiv:1311.3011v2
fatcat:2wncc6nwjbawbe7e4scr5yuckm
Towards quantitative modeling of task confirmations in human-robot dialog
2011
2011 IEEE International Conference on Robotics and Automation
We test our system through human-interface experiments, based on a framework custom designed for our family of amphibious robots, and demonstrate the utility of the framework in the presence of large task ...
Specifically, this research aims to quantitatively model confirmation feedback, as required by a robot while communicating with a human operator to perform a particular task. ...
the interaction mechanism across a wider user population and a larger range of dialog models, across multiple robotic platforms, including terrestrial and aerial vehicles. ...
doi:10.1109/icra.2011.5979633
dblp:conf/icra/SattarD11
fatcat:ur7qdupbcfaohhuxid4gij7avu