Embodied Axes: Tangible, Actuated Interaction for 3D Augmented Reality Data Spaces

Maxime Cordeil, Benjamin Bach, Andrew Cunningham, Bastian Montoya, Ross T. Smith, Bruce H. Thomas, Tim Dwyer
Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI 2020)
Figure 1: Embodied Axes is built with off-the-shelf electronic components comprising actuated linear potentiometers and rotary buttons. A serial link sends the sensors' values to the computer and reads commands from the computer to move the sliders to given positions. An AR headset displays an immersive 3D visualisation in place, inside the 3D space of the Embodied Axes. These views are captured from the point of view of a Meta 2 headset.

ABSTRACT

We present Embodied Axes, a controller which supports selection operations for 3D imagery and data visualisations in Augmented Reality. The device is an embodied representation of a 3D data space: each of its three orthogonal arms corresponds to a data axis or a domain-specific frame of reference. Each axis is composed of a pair of tangible, actuated range sliders for precise data selection, and rotary encoder knobs for additional parameter tuning or menu navigation. The motor-actuated sliders support alignment to positions of significant values within the data, or coordination with other input, e.g., mid-air gestures in the data space, touch gestures on the surface below the data, or another Embodied Axes device supporting multi-user scenarios. We conducted expert enquiries in medical imaging, which provided formative feedback on domain tasks and refinements to the design. Additionally, a controlled user study found that Embodied Axes was overall more accurate than conventional tracked controllers for selection tasks.
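The figure caption describes a bidirectional serial exchange: the device streams slider and knob values to the host, and the host sends commands that drive the motorised sliders to target positions. A minimal sketch of such an exchange, assuming a simple line-based message format ("<axis><index>:<value>") that is purely illustrative and not the paper's actual protocol:

```python
# Hypothetical message handling for a device like Embodied Axes.
# The format "<axis><index>:<value>" and the "M" command prefix are
# assumptions for illustration, not the authors' implementation.

def parse_sensor_line(line: str) -> tuple[str, int, int]:
    """Parse a sensor report, e.g. 'X0:512' -> ('X', 0, 512):
    axis letter, slider index on that axis, raw sensor value."""
    head, value = line.strip().split(":")
    return head[0], int(head[1:]), int(value)

def make_move_command(axis: str, index: int, position: int) -> str:
    """Build a command asking the actuated slider to move to `position`,
    e.g. for aligning sliders to significant data values."""
    return f"M{axis}{index}:{position}\n"
```

In practice these messages would travel over a serial port (e.g. via pyserial's `Serial.readline()` and `Serial.write()`), with the microcontroller firmware interpreting move commands and reporting sensor changes.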
doi:10.1145/3313831.3376613 dblp:conf/chi/CordeilBCMSTD20