12 Hits in 4.9 sec

TOKCS: Tool for Organizing Key Characteristics of VAM-HRI Systems [article]

Thomas R. Groechel, Michael E. Walker, Christine T. Chang, Eric Rosen, Jessica Zosa Forde
2021 arXiv   pre-print
Frameworks have begun to emerge to categorize Virtual, Augmented, and Mixed Reality (VAM) technologies that provide immersive, intuitive interfaces to facilitate Human-Robot Interaction.  ...  To showcase the tool's capability, TOKCS is applied to the ten papers from the fourth VAM-HRI workshop and examined for key trends and takeaways.  ...
arXiv:2108.03477v2 fatcat:ubfq2dx7ifao7emzilreirem7a

The 1st International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction

Tom Williams, Daniel Szafir, Tathagata Chakraborti, Heni Ben Amor
2018 The AI Magazine  
The 1st International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) was held in 2018 in conjunction with the 13th International Conference on Human-Robot Interaction  ...  and brought together researchers from the fields of Human-Robot Interaction (HRI), Robotics, Artificial Intelligence, and Virtual, Augmented, and Mixed Reality in order to identify challenges in mixed  ...
doi:10.1609/aimag.v39i4.2822 fatcat:y3x47u6cqffjbc46tpacvxn7lu

Extended Reality in Robotics [From the Guest Editors]

Elena De Momi, Mahdi Tavakoli, Jeffrey Delmerico, Antonio Frisoli, Mark A. Minor, Giovanni Rossini, Paul Chippendale
2022 IEEE robotics & automation magazine  
In the third article, Groechel et al. apply the tool for organizing key characteristics of virtual, augmented, and mixed reality technologies in human-robot interaction (VAM-HRI) framework to several papers  ...  This special issue explores recent advances in AR, VR, and MR for human-robot interaction (HRI) in the field of robotics.  ...
doi:10.1109/mra.2022.3143186 fatcat:uvyazoywurap5ghjlomuekksu4

Augmented Reality Appendages for Robots: Design Considerations and Recommendations for Maximizing Social and Functional Perception [article]

Ipek Goktan, Karen Ly, Thomas R. Groechel, Maja J. Mataric
2022 arXiv   pre-print
In order to address the limitations of gestural capabilities in physical robots, researchers in Virtual, Augmented, Mixed Reality Human-Robot Interaction (VAM-HRI) have been using augmented-reality visualizations  ...  The proposed recommendations provide the VAM-HRI community with starting points for selecting appropriate gesture types for a multitude of interaction contexts.  ...
arXiv:2205.06747v1 fatcat:z5ngpdk6lncfvn37wphs2gatgy

Augmented, Mixed, and Virtual Reality Enabling of Robot Deixis [chapter]

Tom Williams, Nhan Tran, Josh Rands, Neil T. Dantam
2018 Lecture Notes in Computer Science  
Recent work in augmented, mixed, and virtual reality stands to enable enormous advances in robot deixis, both by allowing robots to gesture in ways that were not previously feasible, and by enabling gesture  ...  In this paper, we summarize our own recent work on using augmented, mixed, and virtual-reality techniques to advance the state of the art of robot-generated deixis.  ...  In March 2018, the first international workshop on Virtual, Augmented, and Mixed-Reality for Human-Robot Interaction (VAM-HRI) was held at the 2018 international conference on Human-Robot Interaction  ...
doi:10.1007/978-3-319-91581-4_19 fatcat:lomkhl32hrfozcaavtk5qo3c7y

Augmented Reality for Robotics: A Review

Zhanat Makhataeva, Huseyin Atakan Varol
2020 Robotics  
and control: trajectory generation, robot programming, simulation, and manipulation; (3) Human-robot interaction (HRI): teleoperation, collaborative interfaces, wearable robots, haptic interfaces, brain-computer  ...  Augmented reality (AR) is used to enhance the perception of the real world by integrating virtual objects into an image sequence acquired from various camera technologies.  ...  [111], the first workshop on "Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)" brought together works where AR, VR, and MR were integrated with different robotic systems  ...
doi:10.3390/robotics9020021 doaj:a4cff00359374fcf99f0aa2b2771bc3a fatcat:upn3jaifw5barbodr5c2n73ucq

Learning robot motor skills with mixed reality [article]

Eric Rosen, Sreehari Rammohan, Devesh Jha
2022 arXiv   pre-print
Mixed Reality (MR) has recently shown great success as an intuitive interface for enabling end-users to teach robots.  ...  Related works have used MR interfaces to communicate robot intents and beliefs to a co-located human, as well as developed algorithms for taking multi-modal human input and learning complex motor behaviors  ...
arXiv:2203.11324v1 fatcat:6s5towvgofbubc4ckiaxwoz6ga

Crisis Ahead? Why Human-Robot Interaction User Studies May Have Replicability Problems and Directions for Improvement

Benedikt Leichtmann, Verena Nitsch, Martina Mara
2022 Frontiers in Robotics and AI  
While human-robot interaction (HRI) is an interdisciplinary research field, the study of human behavior, cognition, and emotion in HRI also plays a vital part.  ...  This article aims to provide a basis for further discussion and a potential outline for improvements in the field.  ...  Workshop on Virtual, Augmented, and Mixed-Reality for Human-Robot Interaction (VAM-HRI 2021). Mara, M., Stein, J.-P., Latoschik, M. E., Lugrin, B., Schreiner, C., Hostettler, R., et al. (2021b).  ...
doi:10.3389/frobt.2022.838116 pmid:35360497 pmcid:PMC8961736 fatcat:oxof3xidjzcfxk2r5qlvzx3xbu

Demonstrating and Learning Multimodal Socio-communicative Behaviors for HRI: Building Interactive Models from Immersive Teleoperation Data

Gérard Bailly, Frédéric Elisei
2018 FAIM/ISCA Workshop on Artificial Intelligence for Multimodal Human Robot Interaction   unpublished
We finally argue for establishing stronger gateways between HRI and Augmented/Virtual Reality research domains.  ...  Collecting and modeling multimodal interactive data is thus a major issue for fostering AI for HRI.  ...  The area of Virtual, Augmented and Mixed Reality (VAMR) interactions between humans and robots, considering not only robots as a way to augment reality but also ways to perceive  ...
doi:10.21437/ai-mhri.2018-10 fatcat:4dbp6fw5bjh6xfxnzdecuwbocm

Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces [article]

Ryo Suzuki, Adnan Karim, Tian Xia, Hooman Hedayati, Nicolai Marquardt
2022 pre-print
Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces).  ...  Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots.  ...  We also searched for synonyms of each keyword, such as "mixed reality", "AR", "MR" for augmented reality and "robotic", "actuated", "shape-changing" for robot.  ...
doi:10.1145/3491102.3517719 arXiv:2203.03254v1 fatcat:6k7i4gvxyvgtvipdwh3lctcenq

A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data

Samadiani, Huang, Cai, Luo, Chi, Xiang, He
2019 Sensors  
Facial Expression Recognition (FER) can be widely applied to various research areas, such as mental disease diagnosis and human social/physiological interaction detection.  ...  We briefly introduce the benchmark data sets related to FER systems for each category of sensors and extend our survey to the open challenges and issues.  ...  Many technologies such as virtual reality (VR) [9] and augmented reality (AR) [10] employ robust FER to implement natural, friendly communication with humans.  ...
doi:10.3390/s19081863 fatcat:bqsx53jtwvf23cs6ykpfi7qqga

Dagstuhl Reports, Volume 8, Issue 6, June 2018, Complete Issue [article]

2019
We proceed with another example dataset which would benefit from metadata: Jupyter notebooks, for which we envision the following uses: publish along with research papers to encourage other researchers  ...  We benchmark PASTE using Write-Ahead Logging and B+tree, as well as porting it to key-value stores and a software switch, and show PASTE significantly outperforms well-tuned Linux and the state of the art  ...  Improving Human-Robot Handover Research by Mixed Reality Techniques. VAM-HRI 2018.  ...
doi:10.4230/dagrep.8.6 fatcat:3ssdbne26vbafli4wg4udjocly