A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2006.
Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology (UIST '00)
We have implemented a computer interface that renders synchronized auditory and haptic stimuli with very low (0.5 ms) latency. The audio and haptic interface (AHI) includes a Pantograph haptic device that reads position input from a user and renders force output based on this input. We synthesize audio by convolving the force profile generated by user interaction with the impulse response of the virtual surface. Auditory and haptic modes are tightly coupled because we produce both stimuli from …

doi:10.1145/354401.354437
dblp:conf/uist/DiFilippoP00
fatcat:wpdowyb4rvfixbr4flt7mrlwcq
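The synthesis step the abstract describes, convolving a user-generated force profile with the impulse response of a virtual surface, can be sketched as follows. This is a minimal illustration, not the AHI's actual implementation; the sample rate, the damped-sinusoid impulse response, and the idealized tap force are all assumptions.

```python
import numpy as np

# Assumed sample rate; the paper's real-time system uses its own rates.
fs = 44100
t = np.arange(0, 0.05, 1 / fs)

# Hypothetical impulse response of a virtual surface: a single
# exponentially damped sinusoidal mode (a common contact-sound model).
ir = np.exp(-80.0 * t) * np.sin(2 * np.pi * 440.0 * t)

# Hypothetical force profile from user interaction: an idealized tap
# (unit impulse followed by silence).
force = np.zeros(256)
force[0] = 1.0

# Audio output = force profile convolved with the surface impulse response.
audio = np.convolve(force, ir)
```

Because the same force profile drives both the haptic rendering and this convolution, the auditory and haptic stimuli stay coupled by construction; with a unit-impulse force the output simply reproduces the impulse response.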