A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
The file type is application/pdf.
Real-time auditory-visual distance rendering for a virtual reaching task
2007
Proceedings of the 2007 ACM symposium on Virtual reality software and technology - VRST '07
This paper reports on a study of the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration in a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates relevant auditory cues for distance perception in dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance.
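The abstract does not detail the synthesis model itself, so the following is only a minimal illustrative sketch of the kind of auditory distance cues such a renderer might compute: inverse-distance amplitude attenuation, an equal-power interaural level difference, and a Woodworth-style interaural time difference. The function name `binaural_distance_cues` and the constants (`HEAD_RADIUS`, `REF_DISTANCE`) are hypothetical and are not taken from the paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, rough average head radius (assumption)
REF_DISTANCE = 1.0       # m, distance at which gain is 1 (assumption)


def binaural_distance_cues(source_pos, sample_rate=44100):
    """Per-ear gain and delay for a source at (x, y) metres,
    listener at the origin facing +y.

    Illustrative sketch only; not the model described in the paper.
    Returns (gain_left, gain_right, delay_left_samples, delay_right_samples).
    """
    x, y = source_pos
    distance = max(np.hypot(x, y), 1e-3)

    # Intensity cue: inverse-distance (1/r) amplitude attenuation,
    # clamped so sources inside REF_DISTANCE are not boosted.
    gain = REF_DISTANCE / max(distance, REF_DISTANCE)

    # Interaural level difference via equal-power panning from azimuth.
    azimuth = np.arctan2(x, y)            # 0 = straight ahead, >0 = right
    pan = 0.5 * (1.0 + np.sin(azimuth))   # 0 = full left, 1 = full right
    gain_left = gain * np.sqrt(1.0 - pan)
    gain_right = gain * np.sqrt(pan)

    # Interaural time difference (Woodworth approximation): the ear
    # farther from the source is delayed.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + np.sin(azimuth))
    delay_left = max(itd, 0.0) * sample_rate
    delay_right = max(-itd, 0.0) * sample_rate

    return gain_left, gain_right, delay_left, delay_right


# Example: a source 2 m away, 45 degrees to the listener's right.
print(binaural_distance_cues((np.sqrt(2.0), np.sqrt(2.0))))
```

In a real-time setting like the one the paper describes, these parameters would be recomputed each frame from tracked listener and target positions and applied to the source signal, with the cues varying continuously as the hand moves toward the virtual target.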
doi:10.1145/1315184.1315217
dblp:conf/vrst/MionAMBS07
fatcat:lde45asppjblpmgj2zhhlt76oq