Type-Hover-Swipe in 96 Bytes
Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI '14)
Figure 1. We present a novel mechanical keyboard combining motion gestures on and directly above the keys with regular tactile typing. A low-resolution but high-speed sensor (A) is embedded in an off-the-shelf keyboard (B). IR proximity sensors are interspersed between the keycaps (C). This yields a low-resolution raw intensity image when hands interact above the device (D). A sequence of these images is accumulated into proximity (E) and motion (F) history images. Together these form a motion signature (E+F), which can be used to robustly recognize a number of dynamic on-keyboard (G) and hover gestures (H) using a machine learning-based classifier.

ABSTRACT

We present a new type of augmented mechanical keyboard that senses rich, expressive motion gestures performed both on and directly above the device. A low-resolution matrix of infrared (IR) proximity sensors is interspersed with the keys of a regular mechanical keyboard. This results in coarse but high frame-rate motion data. We extend a machine learning algorithm, traditionally used for static classification only, to robustly support dynamic, temporal gestures. We propose the use of motion signatures, a technique that utilizes pairs of motion history images and a random forest classifier to robustly recognize a large set of motion gestures. Our technique achieves a mean per-frame classification accuracy of 75.6% in leave-one-subject-out and 89.9% in half-test/half-training cross-validation. We detail the hardware and gesture recognition algorithm, provide accuracy results, and demonstrate a large set of gestures designed to be performed with the device. We conclude with qualitative feedback from users and a discussion of limitations and areas for future work.
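To make the motion-signature idea concrete, the following is a minimal sketch of the two history images it pairs. The motion history image follows the classic Bobick-Davis construction; the decaying per-pixel maximum used for the proximity history image is one plausible reading, not necessarily the paper's exact formulation, and the parameter values (`thresh`, `decay`, `tau`) are illustrative assumptions. The 96-element frame mirrors the paper's 96-byte raw sensor image.

```python
def motion_history_image(frames, thresh=10, decay=32, tau=255):
    """Accumulate a motion history image over low-res intensity frames.

    `frames` is a sequence of equal-length lists of sensor intensities
    (here, 96 values, one per IR proximity sensor). A pixel whose
    frame-to-frame change exceeds `thresh` is set to `tau`; otherwise
    its history value decays by `decay` toward zero, so recent motion
    appears bright and older motion fades.
    """
    mhi = [0] * len(frames[0])
    for prev, cur in zip(frames, frames[1:]):
        mhi = [
            tau if abs(c - p) > thresh else max(m - decay, 0)
            for p, c, m in zip(prev, cur, mhi)
        ]
    return mhi


def proximity_history_image(frames, decay=32):
    """Decaying per-pixel maximum of raw intensity: a simple way to
    retain where the hand has recently hovered (illustrative reading
    of the proximity history image)."""
    phi = [0] * len(frames[0])
    for frame in frames:
        phi = [max(f, max(h - decay, 0)) for f, h in zip(frame, phi)]
    return phi
```

Concatenating the two vectors gives a per-frame motion signature that could then be fed to a random forest, matching the per-frame classification setup the abstract evaluates.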