Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes

Jacob O. Wobbrock, Andrew D. Wilson, Yang Li
Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology (UIST '07), 2007
Although mobile, tablet, large display, and tabletop computers increasingly present opportunities for using pen, finger, and wand gestures in user interfaces, implementing gesture recognition has largely been the privilege of pattern matching experts, not user interface prototypers. Although some user interface libraries and toolkits offer gesture recognizers, such infrastructure is often unavailable in design-oriented environments like Flash, scripting environments like JavaScript, or brand new off-desktop prototyping environments. To enable novice programmers to incorporate gestures into their UI prototypes, we present a "$1 recognizer" that is easy, cheap, and usable almost anywhere in about 100 lines of code. In a study comparing our $1 recognizer, Dynamic Time Warping, and the Rubine classifier on user-supplied gestures, we found that $1 obtains over 97% accuracy with only 1 loaded template and 99% accuracy with 3+ loaded templates. These results were nearly identical to DTW and superior to Rubine. In addition, we found that medium-speed gestures, in which users balanced speed and accuracy, were recognized better than slow or fast gestures for all three recognizers. We also discuss the effect that the number of templates or training examples has on recognition, the score falloff along recognizers' N-best lists, and results for individual gestures. We include detailed pseudocode of the $1 recognizer to aid development, inspection, extension, and testing.

Figure 1. Unistroke gestures useful for making selections, executing commands, or entering symbols. This set of 16 was used in our study of $1, DTW [18,28], and Rubine [23].
doi:10.1145/1294211.1294238 dblp:conf/uist/WobbrockWL07
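The paper's own pseudocode appears in its appendix and is not reproduced on this page. As a rough illustration only, the TypeScript sketch below reconstructs the $1 pipeline the abstract alludes to: resample the stroke to a fixed number of points, rotate it so its indicative angle is zero, scale it to a reference square, translate its centroid to the origin, and score the candidate against stored templates with a golden-section search over rotation. The constants (64 points, a 250-unit square, a ±45° search) and the names `normalize` and `recognize` are assumptions drawn from common descriptions of the algorithm, not verified against the published listing.

```typescript
// Hypothetical reconstruction of the $1 pipeline, not the paper's verbatim
// pseudocode: resample -> rotate -> scale -> translate -> match.
type Point = { x: number; y: number };

const N = 64, SIZE = 250;               // assumed point count / square side
const ANGLE_RANGE = Math.PI / 4;        // assumed +/-45 deg rotation search
const ANGLE_PRECISION = Math.PI / 90;   // ~2 deg stopping threshold
const PHI = 0.5 * (-1 + Math.sqrt(5));  // golden ratio for the search

const dist = (a: Point, b: Point) => Math.hypot(a.x - b.x, a.y - b.y);
const centroid = (pts: Point[]): Point => ({
  x: pts.reduce((s, p) => s + p.x, 0) / pts.length,
  y: pts.reduce((s, p) => s + p.y, 0) / pts.length,
});

// Step 1: resample the raw stroke into N equidistantly spaced points.
function resample(points: Point[], n = N): Point[] {
  const pts = points.map(p => ({ ...p }));
  const I = pts.slice(1).reduce((s, p, i) => s + dist(pts[i], p), 0) / (n - 1);
  const out: Point[] = [{ ...pts[0] }];
  let D = 0;
  for (let i = 1; i < pts.length; i++) {
    const d = dist(pts[i - 1], pts[i]);
    if (D + d >= I) {
      const t = (I - D) / d;
      const q = { x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
                  y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y) };
      out.push(q);
      pts.splice(i, 0, q); // q becomes the start of the next segment
      D = 0;
    } else D += d;
  }
  while (out.length < n) out.push({ ...out[out.length - 1] }); // rounding pad
  return out;
}

// Rotate all points around their centroid by theta radians.
function rotateBy(pts: Point[], theta: number): Point[] {
  const c = centroid(pts), cos = Math.cos(theta), sin = Math.sin(theta);
  return pts.map(p => ({
    x: (p.x - c.x) * cos - (p.y - c.y) * sin + c.x,
    y: (p.x - c.x) * sin + (p.y - c.y) * cos + c.y,
  }));
}

// Steps 2-4: zero the centroid-to-first-point ("indicative") angle, scale
// non-uniformly to a SIZE x SIZE square, and translate centroid to origin.
function normalize(points: Point[]): Point[] {
  let pts = resample(points);
  let c = centroid(pts);
  pts = rotateBy(pts, -Math.atan2(pts[0].y - c.y, pts[0].x - c.x));
  const xs = pts.map(p => p.x), ys = pts.map(p => p.y);
  const w = Math.max(...xs) - Math.min(...xs) || 1; // guard degenerate strokes
  const h = Math.max(...ys) - Math.min(...ys) || 1;
  pts = pts.map(p => ({ x: (p.x * SIZE) / w, y: (p.y * SIZE) / h }));
  c = centroid(pts);
  return pts.map(p => ({ x: p.x - c.x, y: p.y - c.y }));
}

// Mean point-to-point distance between two equal-length normalized paths.
const pathDistance = (a: Point[], b: Point[]) =>
  a.reduce((s, p, i) => s + dist(p, b[i]), 0) / a.length;

// Golden-section search for the rotation minimizing distance to a template.
function distanceAtBestAngle(cand: Point[], tmpl: Point[]): number {
  let lo = -ANGLE_RANGE, hi = ANGLE_RANGE;
  let x1 = PHI * lo + (1 - PHI) * hi, x2 = (1 - PHI) * lo + PHI * hi;
  let f1 = pathDistance(rotateBy(cand, x1), tmpl);
  let f2 = pathDistance(rotateBy(cand, x2), tmpl);
  while (Math.abs(hi - lo) > ANGLE_PRECISION) {
    if (f1 < f2) {
      hi = x2; x2 = x1; f2 = f1;
      x1 = PHI * lo + (1 - PHI) * hi;
      f1 = pathDistance(rotateBy(cand, x1), tmpl);
    } else {
      lo = x1; x1 = x2; f1 = f2;
      x2 = (1 - PHI) * lo + PHI * hi;
      f2 = pathDistance(rotateBy(cand, x2), tmpl);
    }
  }
  return Math.min(f1, f2);
}

// Recognize: the best-scoring template wins; score is normalized to [0, 1].
function recognize(points: Point[], templates: { name: string; points: Point[] }[]) {
  const cand = normalize(points);
  let best = { name: "no match", score: 0 };
  for (const t of templates) {
    const d = distanceAtBestAngle(cand, normalize(t.points));
    const score = 1 - d / (0.5 * Math.SQRT2 * SIZE); // half-diagonal norm
    if (score > best.score) best = { name: t.name, score };
  }
  return best;
}
```

To use the sketch, store one or more `{ name, points }` templates per gesture class and call `recognize` with a captured stroke; scores near 1 indicate close matches, and (as the abstract reports) accuracy improves as more templates per gesture are loaded.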