1,722 Hits in 6.5 sec

Enabling Intuitive Human-Robot Teaming Using Augmented Reality and Gesture Control [article]

Jason M. Gregory, Christopher Reardon, Kevin Lee, Geoffrey White, Ki Ng, Caitlyn Sims
2019 arXiv   pre-print
In this work, we present and analyze an augmented reality-enabled, gesture-based system that supports intuitive human-robot teaming through improved information exchange.  ...  However, one of the critical challenges in realizing an effective human-robot team is efficient information exchange - both from the human to the robot as well as from the robot to the human.  ...  In the subsequent sections, we first describe the hardware we found necessary for the  ...  [Figure 1: System diagram of our proposed AR-enabled, gesture control system for intuitive human-robot teaming.]
arXiv:1909.06415v1 fatcat:qqxezeqflrfnxcnu3pc6452lve

Collaborating with a Mobile Robot: An Augmented Reality Multimodal Interface

Scott A. Green, XiaoQi Chen, Mark Billinghurst, J. Geoffrey Chase
2008 IFAC Proceedings Volumes  
Augmented Reality (AR) technology is used to facilitate natural use of gestures and provide a common 3D spatial reference for both the robot and human, thus providing a means for grounding of communication  ...  In our prototype the human communicates with the mobile robot using natural speech and gestures, for example, by selecting a point in 3D space and saying "go here" or "go behind that".  ...  Their approach used augmented reality overlays in a fixed work environment to enable the human 'director' to use spatial referencing to interactively plan and optimise the path of a robotic manipulator  ... 
doi:10.3182/20080706-5-kr-1001.02637 fatcat:anq3vd23wvgqzi7xmyu3dq2gza

Augmented Reality for Human-Robot Collaboration [chapter]

Scott A. Green, Mark Billinghurst, XiaoQi Chen, J. Geoffrey Chase
2007 Human Robot Interaction  
Its use of gestures, speech and eye contact enables the robot to effectively communicate with humans.  ...  The MCP will combine the speech from the speech processing module, the gesture information from the gesture-processing module and use the Human-Robot Collaboration Augmented Reality Environment (HRC-ARE)  ...  Human-robot interaction research is diverse and covers a wide range of topics.  ... 
doi:10.5772/5187 fatcat:d7e7swkbtrgblo5yw6xtgvqfem

Human-Robot Collaboration: A Literature Review and Augmented Reality Approach in Design

Scott A. Green, Mark Billinghurst, XiaoQi Chen, J. Geoffrey Chase
2008 International Journal of Advanced Robotic Systems  
This article reviews the field of human-robot interaction and augmented reality, investigates the potential avenues for creating natural human-robot collaboration through spatial dialogue utilizing AR  ...  However, little attention has been paid to joint human-robot teams. Making human-robot collaboration natural and efficient is crucial.  ...  Acknowledgements We would like to acknowledge the collaboration of Randy Stiles and Scott Richardson at the Lockheed Martin Space Systems Company, Sunnyvale California, USA.  ... 
doi:10.5772/5664 fatcat:pmkrifjifve27a2pdvz36rocwy

RoboTable: An Infrastructure for Intuitive Interaction with Mobile Robots in a Mixed-Reality Environment

Haipeng Mi, Aleksander Krzywinski, Tomoki Fujita, Masanori Sugimoto
2012 Advances in Human-Computer Interaction  
the possibility of using robots and intuitive interaction to enhance learning.  ...  With a flexible software toolkit and specifically developed robots, the platform enables various modes of interaction with mobile robots.  ...  The robot will track and move to follow the projected image's translation and rotation. This configuration scheme realized robot control and its augmentation using only one projector.  ... 
doi:10.1155/2012/301608 fatcat:hxsiehnfurfs5j6yznagtkgaj4

Evaluating the Augmented Reality Human-Robot Collaboration System

Scott A. Green, J. Geoffrey Chase, XiaoQi Chen, Mark Billinghurst
2008 2008 15th International Conference on Mechatronics and Machine Vision in Practice  
In contrast, the Augmented Reality Human-Robot Collaboration (AR-HRC) System interface enables the user to discuss and review a plan with the robot prior to execution.  ...  Hence, a multi-modal system has been developed that allows the remote human operator to view the robot in its work environment through an Augmented Reality (AR) interface.  ...  This interface is the Augmented Reality Human-Robot Collaboration System.  ... 
doi:10.1109/mmvip.2008.4749586 fatcat:hxq7yk57svd3bjb3jyjlajaqjq

Evaluating the augmented reality human-robot collaboration system

Scott A. Green, J. Geoffrey Chase, XiaoQi Chen, Mark Billinghurst
2010 International Journal of Intelligent Systems Technologies and Applications  
In contrast, the Augmented Reality Human-Robot Collaboration (AR-HRC) System interface enables the user to discuss and review a plan with the robot prior to execution.  ...  Hence, a multi-modal system has been developed that allows the remote human operator to view the robot in its work environment through an Augmented Reality (AR) interface.  ...  This interface is the Augmented Reality Human-Robot Collaboration System.  ... 
doi:10.1504/ijista.2010.030195 fatcat:juvoqwaltbdlndl7ujeogmg6aa

CobotAR: Interaction with Robots using Omnidirectionally Projected Image and DNN-based Gesture Recognition [article]

Elena Nazarova, Oleg Sautenkov, Miguel Altamirano Cabrera, Jonathan Tirado, Valerii Serpiva, Viktor Rakhmatulin, Dzmitry Tsetserukou
2021 arXiv   pre-print
The proposed technology suggests a novel way of interaction with machines to achieve safe, intuitive, and immersive control mediated by a robotic projection system and DNN-based algorithm.  ...  The system allows users to have a more intuitive experience with robotic applications using just their hands.  ...  ACKNOWLEDGMENT The reported study was funded by RFBR and CNRS according to the research project No. 21-58-15006.  ... 
arXiv:2110.10571v1 fatcat:lhnxlfy3nvg2xdemxcyx7ign4a

An Augmented Reality Human-Robot Collaboration System [chapter]

Scott A. Green, J. Geoffrey Chase, XiaoQi Chen, Mark Billinghurst
2009 Mobile Robots - State of the Art in Land, Sea, Air, and Collaborative Missions  
Gesture processing enables the human team members to use deictic referencing and normal gestures to communicate effectively with a robotic system.  ...  The MCP combines the speech from the speech processing module, the gesture information from the gesture-processing module and uses the Human-Robot Collaboration Augmented Reality Environment (HRC-ARE)  ... 
doi:10.5772/6997 fatcat:rxe52ax2uzerpmg4pazkl4mjqm

User Interaction Feedback in a Hand-Controlled Interface for Robot Team Tele-operation Using Wearable Augmented Reality [article]

Alberto Cannavò, Fabrizio Lamberti
2017 Smart Tools and Applications in Graphics  
In this work, an Augmented Reality (AR)-based interface is deployed on a head-mounted display to enable tele-operation of a remote robot team using hand movements and gestures.  ...  Continuous advancements in the field of robotics and its increasing spread across heterogeneous application scenarios make the development of ever more effective user interfaces for human-robot interaction  ...  In [CCG*17], desktop AR is used to create a hand gesture-based interface for the control of a robot team including a rover and a robotic arm.  ... 
doi:10.2312/stag.20171227 dblp:conf/egItaly/CannavoL17 fatcat:7xw27rperbcatevixs2oa3v6v4

Mixed-Granularity Human-Swarm Interaction [article]

Jayam Patel, Yicong Xu, Carlo Pinciroli
2019 arXiv   pre-print
We present an augmented reality human-swarm interface that combines two modalities of interaction: environment-oriented and robot-oriented.  ...  In this paper, we report a user study which indicates that, at least in collective transport, environment-oriented interaction is more effective than purely robot-oriented interaction, and that the two  ...  Using a tablet-based augmented reality application, the user can select the objects to transport and drag them to their intended destination.  ... 
arXiv:1901.08522v1 fatcat:3m2jpgzhkzg2bjoxqllzrvgb4e

Neural Network Based Lidar Gesture Recognition for Realtime Robot Teleoperation [article]

Simon Chamorro, Jack Collier, François Grondin
2021 arXiv   pre-print
We propose a novel low-complexity lidar gesture recognition system for mobile robot control robust to gesture variation.  ...  The lidar-based pose estimator and gesture classifier use data augmentation and automated labeling techniques, requiring a minimal amount of data collection and avoiding the need for manual labeling.  ...  Large-scale adoption of these systems into society is dependent, in part, on their ease of use. Human-robot interaction (HRI) mechanisms must be intuitive, robust and efficient without imposing a heavy  ... 
arXiv:2109.08263v1 fatcat:hkep2ke4ifdetj3cx3iprgmugy

ARROCH: Augmented Reality for Robots Collaborating with a Human [article]

Kishan Chandan, Vidisha Kudalkar, Xiang Li, Shiqi Zhang
2022 arXiv   pre-print
Human-robot collaboration frequently requires extensive communication, e.g., using natural language and gestures.  ...  Augmented reality (AR) has provided an alternative way of bridging the communication gap between robots and people.  ...  Leveraging augmented reality (AR) technologies, we introduce AR for robots collaborating with a human (ARROCH), a novel algorithm and system that enables bidirectional, multi-turn, beyond-proximity  ... 
arXiv:2109.10400v2 fatcat:5mfvlsp3mvdhrohhzaqusifimy

The Design and Evaluation of an Ergonomic Contactless Gesture Control System for Industrial Robots

Gilbert Tang, Phil Webb
2018 Journal of Robotics  
The novelties of this research are the use of human factor analysis tools in the human-centred development process, as well as the gesture control design that enables users to control an industrial robot's  ...  The system has potential to be used as an input device for industrial robot control in a human-robot collaboration scene.  ... 
doi:10.1155/2018/9791286 fatcat:jvxywn3ca5ey3gehsrha5htwvm

Spoken language and multimodal applications for electronic realities

A. Cheyer, L. Julia
1999 Virtual Reality  
Every day, people interact with each other, with pets, and sometimes with physical objects by using a combination of expressive modalities, such as spoken words, tone of voice, pointing and gesturing,  ...  We use the term 'electronic reality' (ER) to encompass a broad class of concepts that mix real-world metaphors and computer interfaces.  ...  Recently, we have been working on constructing a wearable user interface that will enable a human to work with the robots as part of the team [36] .  ... 
doi:10.1007/bf01408590 fatcat:7pmli3y4bndzjobcnvtlqv3iay
Showing results 1 — 15 out of 1,722 results