Bare-hand human-computer interaction
2001
Proceedings of the 2001 workshop on Perceptive user interfaces - PUI '01
Finger tracking and hand posture recognition are used to paint virtually onto the wall, to control a presentation with hand postures, and to move virtual items on the wall during a brainstorming session ...
Barehanded means that no device and no wires are attached to the user, who controls the computer directly with the movements of his/her hand. Our approach is centered on the needs of the user. ...
Labtec's Spaceball) and magnetic tracking devices (e.g. Polhemus' Isotrack). ...
doi:10.1145/971478.971513
fatcat:nnyqa3uunnfmtd2z55ccvn7v3a
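The bare-hand finger tracking described in the entry above can be illustrated with a generic fingertip-candidate search over a binary hand mask. This is a minimal sketch using a contour/convex-hull approach in OpenCV, not the authors' exact template-based detector; the function name and defect-depth threshold are illustrative assumptions.

```python
# Hypothetical sketch: locate candidate fingertips in a binary hand mask.
# Generic contour/convex-hull approach (OpenCV 4.x), not the exact fingertip
# detector described in the paper above.
import cv2

def find_fingertips(hand_mask, min_defect_depth=20):
    """Return (x, y) points that look like fingertips in a binary mask."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    hand = max(contours, key=cv2.contourArea)            # largest blob = hand
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)
    tips = []
    if defects is not None:
        for start, end, _far, depth in defects[:, 0]:
            if depth / 256.0 > min_defect_depth:          # deep valley between fingers
                tips.append(tuple(hand[start][0]))
                tips.append(tuple(hand[end][0]))
    # de-duplicate nearby hull points that mark the same fingertip
    return list({(int(x), int(y)) for x, y in tips})
```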
Visual touchpad
2004
Proceedings of the 6th international conference on Multimodal interfaces - ICMI '04
Two downward-pointing cameras are attached above a planar surface, and a stereo hand tracking system provides the 3D positions of a user's fingertips on and above the plane. ...
This paper presents the Visual Touchpad, a low-cost vision-based input device that allows for fluid two-handed interactions with desktop PCs, laptops, public kiosks, or large wall displays. ...
ACKNOWLEDGEMENTS We thank Ravin Balakrishnan, Allan Jepson, and Abhishek Ranjan from the University of Toronto for valuable discussions. ...
doi:10.1145/1027933.1027980
dblp:conf/icmi/MalikL04
fatcat:a5xcrbks3bag5ayvv3opjxqx4a
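The touch-versus-hover decision at the heart of such a stereo setup reduces to a disparity-to-depth computation against the known plane. The sketch below is in the spirit of the Visual Touchpad arrangement; the camera parameters and the 5 mm touch threshold are assumed values for illustration, not figures from the paper.

```python
# Hypothetical sketch: decide "touching the surface" vs "hovering" from a
# calibrated stereo pair of downward-pointing cameras above a plane.
FOCAL_PX = 800.0      # focal length in pixels (assumed calibration)
BASELINE_M = 0.12     # distance between the two cameras, metres (assumed)
PLANE_DEPTH_M = 0.60  # distance from the cameras to the tabletop plane (assumed)
TOUCH_EPS_M = 0.005   # fingertip within 5 mm of the plane counts as a touch

def fingertip_depth(x_left_px, x_right_px):
    """Depth of a fingertip from its horizontal disparity in the two images."""
    disparity = float(x_left_px - x_right_px)
    if disparity <= 0:
        return None                       # no valid stereo match
    return FOCAL_PX * BASELINE_M / disparity

def is_touching(x_left_px, x_right_px):
    z = fingertip_depth(x_left_px, x_right_px)
    return z is not None and (PLANE_DEPTH_M - z) < TOUCH_EPS_M

if __name__ == "__main__":
    # fingertip seen at x=412 px in the left image and x=252 px in the right
    print(fingertip_depth(412, 252))      # -> 0.6 m, i.e. on the plane
    print(is_touching(412, 252))          # -> True
```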
Interacting with Projected Media on Deformable Surfaces
2007
2007 IEEE 11th International Conference on Computer Vision
We discuss its design and implementation within the context of a radically different approach for controlling home appliances by pressing virtual buttons that are projected on soft deformable surfaces such as a sofa pillow. ...
Hand gestures may also be used as a selection action [15, 16] . However, the task of hand segmentation in unconstrained environment and poor lighting conditions is difficult. Freeman et al. ...
doi:10.1109/iccv.2007.4409125
dblp:conf/iccv/FitrianiG07
fatcat:bi37un7tz5czngmc7eakuio22e
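The hand-segmentation step the excerpt calls difficult is often attempted with a simple skin-colour threshold plus morphological clean-up, which is exactly what breaks down under unconstrained lighting. A minimal sketch follows; the HSV range is an assumed, illustrative value rather than one taken from the paper.

```python
# Hypothetical sketch: skin-colour hand segmentation with morphological clean-up.
# The HSV thresholds are rough illustrative values, not from the cited work.
import cv2
import numpy as np

def segment_hand(bgr_frame):
    """Return a cleaned-up binary mask of skin-coloured pixels."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)      # assumed skin range
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    return mask
```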
TouchLight
2005
ACM SIGGRAPH 2005 Emerging technologies on - SIGGRAPH '05
Image processing techniques are detailed, and several novel capabilities of the system are outlined. ...
A novel touch screen technology is presented. ...
Svoboda, Fingermouse: A Wearable Hand Tracking System. in Ubicomp 2003: Ubiquitous Computing, (2002). ...
doi:10.1145/1187297.1187323
dblp:conf/siggraph/Wilson05
fatcat:xquxakjydvbvrkgookyfkc7yka
TouchLight
2004
Proceedings of the 6th international conference on Multimodal interfaces - ICMI '04
Image processing techniques are detailed, and several novel capabilities of the system are outlined. ...
A novel touch screen technology is presented. ...
Svoboda, Fingermouse: A Wearable Hand Tracking System. in Ubicomp 2003: Ubiquitous Computing, (2002). ...
doi:10.1145/1027933.1027946
dblp:conf/icmi/Wilson04
fatcat:vhulrsqvtbeu5itsyfniwrqi4q
Towards implicit interaction by using wearable interaction device sensors for more than one task
2006
Proceedings of the 3rd international conference on Mobile technology, applications & systems - Mobility '06
User interaction and context awareness are key issues in today's wearable computing research. Context-aware wearable devices typically use a multitude of special purpose sensors for this. ...
In this paper, we show for a simple test case that even simple sensors in a wearable human computer interaction device can be used for robust context detection. ...
To achieve this functionality a tilt-sensor and different micro-click buttons are used. GestureWrist [14] is a wrist-watch type input device. ...
doi:10.1145/1292331.1292354
dblp:conf/mobility/WittK06
fatcat:idewefmey5hrvgzbqhq7zt3daa
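A tilt sensor can support this kind of coarse context detection with very little computation. The sketch below derives pitch from a static accelerometer sample and maps it to a context label; the axis convention, labels, and threshold are assumptions for illustration, not the paper's classifier.

```python
# Hypothetical sketch: coarse wrist context from tilt (3-axis accelerometer, in g).
import math

def pitch_deg(ax, ay, az):
    """Pitch angle of the device from a static accelerometer sample."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def wrist_context(samples):
    """Very coarse context label from a short window of (ax, ay, az) samples."""
    mean_pitch = sum(pitch_deg(*s) for s in samples) / len(samples)
    if mean_pitch > 45:
        return "hand raised"          # e.g. user is gesturing at the device
    return "hand at rest"

if __name__ == "__main__":
    resting = [(0.0, 0.05, 1.0)] * 10   # gravity along z: forearm lying flat
    raised = [(-0.9, 0.0, 0.3)] * 10    # device tilted up towards the user
    print(wrist_context(resting), wrist_context(raised))
```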
The Gesture Watch: A Wireless Contact-free Gesture based Wrist Interface
2007
2007 11th IEEE International Symposium on Wearable Computers
We introduce the Gesture Watch, a mobile wireless device worn on a user's wrist that allows hand gesture control of other devices. ...
The Gesture Watch utilizes an array of infrared proximity sensors to sense hand gestures made over the device and interprets the gestures using hidden Markov models. ...
The device needs to be large enough so that a user can press physical buttons or move a finger around a touch sensitive surface. ...
doi:10.1109/iswc.2007.4373770
dblp:conf/iswc/KimHLS07
fatcat:mmjfadofljh7tom2roos6f2um4
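Scoring a quantised sensor sequence against one hidden Markov model per gesture and picking the best-scoring model is the standard recipe the entry describes. The sketch below implements the forward algorithm for discrete observations; the two-state toy models and the observation alphabet (which IR sensor currently sees the hand) are invented for illustration, not the Gesture Watch's trained models.

```python
# Hypothetical sketch: classify a sequence of quantised IR-proximity readings
# by its log-likelihood under per-gesture discrete HMMs (toy parameters).
import numpy as np

def log_forward(obs, start_p, trans_p, emit_p):
    """Log-likelihood of an observation sequence under a discrete HMM."""
    alpha = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for o in obs[1:]:
        # alpha_j(t) = logsumexp_i(alpha_i(t-1) + log A_ij) + log B_j(o_t)
        alpha = np.logaddexp.reduce(alpha[:, None] + np.log(trans_p), axis=0)
        alpha = alpha + np.log(emit_p[:, o])
    return np.logaddexp.reduce(alpha)

# Two toy 2-state gesture models over 3 symbols (symbol = active IR sensor).
GESTURES = {
    "swipe_right": (np.array([0.9, 0.1]),
                    np.array([[0.6, 0.4], [0.1, 0.9]]),
                    np.array([[0.8, 0.15, 0.05], [0.05, 0.15, 0.8]])),
    "hover":       (np.array([0.5, 0.5]),
                    np.array([[0.9, 0.1], [0.1, 0.9]]),
                    np.array([[0.1, 0.8, 0.1], [0.1, 0.8, 0.1]])),
}

def classify(obs):
    return max(GESTURES, key=lambda g: log_forward(obs, *GESTURES[g]))

if __name__ == "__main__":
    print(classify([0, 0, 1, 2, 2]))   # hand sweeping across the sensors -> swipe_right
    print(classify([1, 1, 1, 1, 1]))   # hand held over the middle sensor -> hover
```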
Visual interpretation of hand gestures for human-computer interaction: a review
1997
IEEE Transactions on Pattern Analysis and Machine Intelligence
In particular, visual interpretation of hand gestures can help in achieving the ease and naturalness desired for HCI. ...
This has motivated a very active research area concerned with computer vision-based analysis and interpretation of hand gestures. ...
The authors would like to acknowledge Yusuf Azoz and Lalitha Devi for their help with the references, and Karin Pavlovic for her help in reviewing the manuscript. ...
doi:10.1109/34.598226
fatcat:odvqyyqr7vbzjcmlak4m5j5smy