Explorations in the Use of Augmented Reality for Geographic Visualization

Nicholas R. Hedley, Mark Billinghurst, Lori Postner, Richard May, Hirokazu Kato
2002 Presence - Teleoperators and Virtual Environments  
In this paper we describe two explorations in the use of hybrid user interfaces for collaborative geographic data visualization. Our first interface combines three technologies: Augmented Reality (AR), immersive Virtual Reality, and computer-vision-based hand and object tracking. Wearing a lightweight display with a camera attached, users can look at a real map and see three-dimensional virtual terrain models overlaid on the map. From this AR interface they can fly in and experience the model immersively, or use free-hand gestures or physical markers to change the data representation. Building on this work, our second interface explores alternative interface techniques, including a zoomable user interface, paddle interactions, and pen annotations. We describe the system hardware and software, and the implications for GIS and spatial science applications.

As computers become more ubiquitous they are increasingly being used to support remote collaboration. Adding a camera and microphone to a computer converts it into a powerful tool for remote teleconferencing, while software such as Microsoft's NetMeeting allows users to share applications and effortlessly exchange data. Although computer-supported remote work is common, there are fewer interfaces for supporting face-to-face collaboration. This is particularly true for viewing and interacting with three-dimensional datasets and models. In this setting it is particularly important for the interface to provide intuitive viewing and manipulation of spatial models while at the same time supporting the normal conversational cues of a face-to-face meeting.

Augmented Reality (AR) is one particularly promising technology for face-to-face collaboration on spatial tasks. AR superimposes three-dimensional virtual images on the real world, so users can see each other, the surrounding real environment, and the virtual models in their midst. Our previous research has found that users can collaborate more effectively in an AR setting than on the same task in an immersive VR interface [3], and that AR interfaces can enhance the cues already present in face-to-face collaboration [4]. One of the greatest benefits of AR interfaces is that they can be integrated into the existing workplace and combined with other, more traditional interface technology. For example, the EMMIE system is a hybrid user interface that synergistically merges information in an AR headset with data shown on monitor and projection displays [6].
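The overlay step at the heart of such an AR interface amounts to projecting virtual terrain vertices, expressed in the tracked map's coordinate frame, through the camera pose recovered by the vision-based tracker. The following is a minimal sketch of that projection under a pinhole camera model; it is not the authors' implementation, and the intrinsic matrix `K` and the pose `R`, `t` are hypothetical example values.

```python
import numpy as np

# Hypothetical camera intrinsics: 800 px focal length,
# principal point at the centre of a 640 x 480 image.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(points, R, t, K):
    """Project N x 3 points in the map/marker frame into image pixels."""
    cam = points @ R.T + t          # rigid transform: marker frame -> camera frame
    uvw = cam @ K.T                 # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:]  # perspective divide to pixel coordinates

# Example pose: map lying 0.5 m in front of the camera, axes aligned with it.
R = np.eye(3)
t = np.array([0.0, 0.0, 0.5])

# A toy terrain vertex at the map origin lands at the image centre.
vertex = np.array([[0.0, 0.0, 0.0]])
print(project(vertex, R, t, K))   # -> [[320. 240.]]
```

In a marker-tracking system of this kind, `R` and `t` are re-estimated from the camera image every frame, so the virtual terrain stays registered to the physical map as users move their heads.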
Within EMMIE, users can seamlessly move virtual objects from being overlaid on the real world to being placed on a desktop monitor. Despite this seamlessness, there are few examples of interfaces that combine AR techniques with other technologies. In this paper we report on hybrid interfaces that combine technologies such as Augmented Reality, immersive Virtual Reality, and computer-vision methods for natural user input. Our prototype interfaces are designed to allow multiple users to view and interact with geospatial data; however, the techniques we describe could be applied to many different application domains. In fact, it is our hope that this work will motivate others to consider how AR technologies can be used in hybrid user interfaces for their own applications. In the remainder of this work we first describe the application area in more depth, then each of the technologies used in our systems and the prototype hybrid interfaces developed around these technologies. We briefly present user experiences with
doi:10.1162/1054746021470577