1,045 Hits in 8.3 sec

Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes

Jacob O. Wobbrock, Andrew D. Wilson, Yang Li
2007 Proceedings of the 20th annual ACM symposium on User interface software and technology - UIST '07  
Although some user interface libraries and toolkits offer gesture recognizers, such infrastructure is often unavailable in design-oriented environments like Flash, scripting environments like JavaScript  ...  In addition, we found that medium-speed gestures, in which users balanced speed and accuracy, were recognized better than slow or fast gestures for all three recognizers.  ...  Prior work has attempted to provide gesture recognition for user interfaces through the use of libraries and toolkits [6, 8, 12, 17] .  ... 
doi:10.1145/1294211.1294238 dblp:conf/uist/WobbrockWL07 fatcat:26lqaazkwzebfewo62m6gyjbqy
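The abstract's claim that the $1 recognizer needs no libraries or training rests on a short template-matching pipeline: resample each stroke to a fixed number of points, scale it to a unit box, translate its centroid to the origin, then pick the stored template with the smallest average point-to-point distance. A minimal Python sketch of that pipeline follows; it omits the published algorithm's rotation-to-indicative-angle and golden-section search steps, and all identifiers are illustrative rather than taken from the paper:

```python
import math

def resample(points, n=64):
    """Resample a stroke to n equidistant points along its path."""
    seg_lengths = [math.dist(points[i - 1], points[i]) for i in range(1, len(points))]
    interval = sum(seg_lengths) / (n - 1)
    out = [points[0]]
    pts = list(points)
    acc = 0.0
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and acc + seg >= interval:
            t = (interval - acc) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            acc = 0.0
        else:
            acc += seg
        i += 1
    # floating-point rounding can leave us short; pad with the endpoint
    return (out + [points[-1]] * (n - len(out)))[:n]

def normalize(points, n=64):
    """Resample, scale to a unit box, and translate the centroid to the origin."""
    pts = resample(points, n)
    xs, ys = zip(*pts)
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    pts = [(x / (w or 1), y / (h or 1)) for x, y in pts]
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    return [(x - cx, y - cy) for x, y in pts]

def path_distance(a, b):
    """Average point-to-point Euclidean distance between two point lists."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the template closest to the candidate stroke."""
    cand = normalize(stroke)
    return min(templates,
               key=lambda name: path_distance(cand, normalize(templates[name])))
```

Because the matcher is a plain nearest-template search, adding a gesture is just adding one more entry to the `templates` dict, which is what makes the approach attractive for interface prototypes.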

A $3 gesture recognizer

Sven Kratz, Michael Rohs
2010 Proceedings of the 15th international conference on Intelligent user interfaces - IUI '10  
A user evaluation of our system resulted in a correct gesture recognition rate of 80% when using a set of 10 unique gestures for classification.  ...  We present the $3 Gesture Recognizer, a simple but robust gesture recognition system for input devices featuring 3D acceleration sensors.  ...  An example application area for our gesture recognizer is user interface prototyping.  ... 
doi:10.1145/1719970.1720026 dblp:conf/iui/KratzR10 fatcat:4h2qqjh7znd5bbgtu3ug7nwhoa

Training Motor Skills Using Haptic Interfaces [chapter]

Otniel Portillo-Rodriguez, Carlo Avizzano, Oscar Sandoval, Adriana Vilchis-Gonzalez, Mariel Davila-Vilchis, Massimo Bergamasco
2012 Haptics Rendering and Applications  
users and develop advanced algorithms that recognize whether a primitive element was drawn by an expert or a novice.  ...  by means of a static library.  ... 
doi:10.5772/25676 fatcat:d2ejhethgfdufh6ewju3dqida4

WatchConnect

Steven Houben, Nicolai Marquardt
2015 Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI '15  
The WatchConnect toolkit consists of (a) wired prototyping smartwatches with sensors through a (b) flexible and extendable hardware layer, (c) a software development platform providing user interface components  ...  To address this problem, we introduce WatchConnect, a toolkit for rapidly prototyping cross-device applications and interaction techniques with smartwatches.  ...  Thanks to Michael Nebeling, Sarah Gallacher and our anonymous reviewers for their feedback and helpful suggestions for the improvement of the manuscript.  ... 
doi:10.1145/2702123.2702215 dblp:conf/chi/HoubenM15 fatcat:b6lrmqlzdbbn3bkfr5wwkcikqi

PyGmI

Matthias Schwaller, Denis Lalanne, Omar Abou Khaled
2010 Proceedings of the 6th Nordic Conference on Human-Computer Interaction Extending Boundaries - NordiCHI '10  
The Portable Gestural Interface PyGmI, which we implemented, is a smart tool for interacting with a system via simple hand gestures.  ...  The user wears color markers on the fingers and a webcam on the chest.  ...  [3] developed the system ShelfTorchlight, which helps users search for a book in a library or a product in a supermarket. The prototype uses a mobile phone and a mobile projector.  ... 
doi:10.1145/1868914.1869026 dblp:conf/nordichi/SchwallerLK10 fatcat:hxgggmlz3jfwfitfqjkrqldoju

LipiTk

Sriganesh Madhvanath, Deepu Vijayasenan, Thanigai Murugan Kadiresan
2007 ACM SIGGRAPH 2007 courses on - SIGGRAPH '07  
This paper describes Lipi Toolkit (LipiTk) -a generic toolkit whose aim is to facilitate development of online handwriting recognition engines for new scripts, and simplify integration of the resulting  ...  The toolkit provides robust implementations of tools, algorithms, scripts and sample code necessary to support the activities of handwriting data collection and annotation, training and evaluation of recognizers  ...  Jagannadan and A. Bharath from the Pen-based Solutions and Handwriting Recognition team at HP Labs India, for their contributions towards the toolkit.  ... 
doi:10.1145/1281500.1281524 dblp:conf/siggraph/MadhvanathVK07 fatcat:qjg6uearyzdfvjvpyy7tiwkppu

An Enhanced Training-Based Arabic Sign Language Virtual Interpreter Using Parallel Recurrent Neural Networks

Mohamed A. Abdou
2018 Journal of Computer Science  
The proposed system uses a deep neural network training-based system for ASL that combines RNNs with Graphical Processing Unit (GPU) parallel processing.  ...  The signing avatar is highly encouraged as a simulator for natural human signs.  ...  Special thanks to the Egyptian NGOs supporting the Deaf for their help in the evaluation process. Ethics The author confirms that this article content has no conflict of interest.  ... 
doi:10.3844/jcssp.2018.228.237 fatcat:szpxcq42qvfb7d3kgdr5vcgr3e

Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects into Tangible 6DoF Interfaces Using Ad hoc UI

Ruofei Du, Alex Olwal, Mathieu Le Goc, Shengzhi Wu, Danhang Tang, Yinda Zhang, Jun Zhang, David Joseph Tan, Federico Tombari, David Kim
2022 CHI Conference on Human Factors in Computing Systems Extended Abstracts  
Real-time environmental tracking has become a fundamental capability in modern mobile phones and AR/VR devices. However, it only allows user interfaces to be anchored at a static location.  ...  Although fiducial and natural-feature tracking overlays interfaces with specific visual features, they typically require developers to define the pattern before deployment. In this paper, we introduce  ...  As a proof-of-concept, we have developed Ad hoc UI (AhUI), a prototype toolkit to empower users to turn everyday objects into opportunistic interfaces on the fly.  ... 
doi:10.1145/3491101.3519911 fatcat:7xk65l5f3faghprkgej5dc7c34

Tactical Language and Culture Training Systems: Using AI to Teach Foreign Languages and Cultures

W. Lewis Johnson, Andre Valente
2009 The AI Magazine  
The Tactical Language and Culture Training System (TLCTS) helps people quickly acquire communicative skills in foreign languages and cultures.  ...  user interface polish, and so on.  ...  The system provides mechanisms for an instructor or training manager to create and configure users and groups of users and produce progress reports.  ... 
doi:10.1609/aimag.v30i2.2240 fatcat:rog4z2bspnazzozqguule3gmfy

Hand Gesture Recognition System Based in Computer Vision and Machine Learning [chapter]

Paulo Trigueiros, Fernando Ribeiro, Luís Paulo Reis
2015 Lecture Notes in Computational Vision and Biomechanics  
Although the implemented prototype was only trained to recognize the vowels, it is easily extended to recognize the rest of the alphabet, being a solid foundation for the development of any vision-based  ...  This is an area with many different possible applications, giving users a simpler and more natural way to communicate with robots/systems interfaces, without the need for extra devices.  ...  Also special thanks to the Polytechnic Institute of Porto, the ALGORITMI Research Centre and the LIACC Research Center, for the opportunity to develop this research work.  ... 
doi:10.1007/978-3-319-13407-9_21 fatcat:yb4a3zsx25g3zdh6pmisqk7kku

Generic System for Human-Computer Gesture Interaction: Applications on Sign Language Recognition and Robotic Soccer Refereeing

Paulo Trigueiros, Fernando Ribeiro, Luis Paulo Reis
2015 Journal of Intelligent and Robotic Systems  
For dynamic gestures, an HMM (Hidden Markov Model) was trained for each gesture that the system could recognize, with a final average accuracy of 93.7%.  ...  The proposed solution is mainly composed of three modules: a pre-processing and hand segmentation module, a static gesture interface module and a dynamic gesture interface module.  ...  For SVM the Dlib library was used, a fast general multiclass SVM classifier.  ...  In Fig. 7 one can see gestures classified and displayed on the right side of the user interface.  ... 
doi:10.1007/s10846-015-0192-4 fatcat:liamnppxzfc2zmagoycprekwsi
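The per-gesture scheme this abstract describes (train one model per gesture, then classify a sequence by whichever model scores it highest) can be illustrated with a deliberately simplified stand-in: a single diagonal Gaussian per gesture in place of a full HMM. All names here are illustrative, not from the paper:

```python
import math

def fit_gaussian(frames):
    """Fit an independent Gaussian per feature dimension to training frames."""
    dims = len(frames[0])
    mean = [sum(f[d] for f in frames) / len(frames) for d in range(dims)]
    # clamp the variance to avoid division by zero on constant features
    var = [max(sum((f[d] - mean[d]) ** 2 for f in frames) / len(frames), 1e-6)
           for d in range(dims)]
    return mean, var

def log_likelihood(seq, model):
    """Sum the per-frame Gaussian log-densities over a feature sequence."""
    mean, var = model
    ll = 0.0
    for frame in seq:
        for d, x in enumerate(frame):
            ll += -0.5 * (math.log(2 * math.pi * var[d])
                          + (x - mean[d]) ** 2 / var[d])
    return ll

def classify_by_likelihood(seq, models):
    """Pick the gesture whose model assigns the sequence the highest score."""
    return max(models, key=lambda g: log_likelihood(seq, models[g]))
```

A real HMM additionally models temporal structure through hidden states and transition probabilities, which is why it suits dynamic gestures; the argmax-over-per-class-likelihoods decision rule, however, is the same.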

Protractor3D

Sven Kratz, Michael Rohs
2011 Proceedings of the 15th international conference on Intelligent user interfaces - IUI '11  
Protractor 3D is a gesture recognizer that extends the 2D touch screen gesture recognizer Protractor [8] to 3D gestures.  ...  It uses a nearest neighbor approach to classify input gestures. It is thus well-suited for application in resource-constrained mobile devices.  ...  Protractor 3D requires only a low number of training gestures and thus imposes a low overhead in design and prototyping phases of gesture-based interfaces, in which designers (or later also users) wish  ... 
doi:10.1145/1943403.1943468 dblp:conf/iui/KratzR11 fatcat:24vl7233ybc6pnuaeq5frlizru
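The snippet above describes Protractor 3D as a nearest-neighbour classifier over 3D acceleration traces, which is what makes it cheap enough for resource-constrained mobile devices. That scheme can be sketched as follows, assuming gestures are already resampled to the same fixed length; the closed-form optimal angular alignment that gives Protractor its name is omitted, and all identifiers are hypothetical:

```python
import math

def unit_vector(samples):
    """Flatten fixed-length (x, y, z) samples into one normalized vector."""
    flat = [c for point in samples for c in point]
    norm = math.sqrt(sum(c * c for c in flat)) or 1.0
    return [c / norm for c in flat]

def cosine_similarity(a, b):
    """Dot product; for unit vectors this is the cosine of their angle."""
    return sum(x * y for x, y in zip(a, b))

def nearest_template(sample, templates):
    """Nearest-neighbour: return the label with highest cosine similarity."""
    v = unit_vector(sample)
    return max(templates,
               key=lambda label: cosine_similarity(v, unit_vector(templates[label])))
```

Because each template costs only one dot product at classification time, the low training burden the snippet mentions follows directly: one or two recorded examples per gesture are enough to populate the template set.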

Form Follows Sound

Baptiste Caramiaux, Alessandro Altavilla, Scott G. Pobiner, Atau Tanaka
2015 Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI '15  
We describe a series of workshops, called Form Follows Sound, where participants ideate imagined sonic interactions, and then realize working interactive sound prototypes.  ...  Sonic interaction is the continuous relationship between user actions and sound, mediated by some technology.  ...  Gestural Sound Toolkit We provided a hardware/software toolkit to realize gestural sound interactions (Figure 1).  ... 
doi:10.1145/2702123.2702515 dblp:conf/chi/CaramiauxAPT15 fatcat:mha77h5aqzecvpvrvrk2ffhsbq

Eyepatch

Dan Maynes-Aminzade, Terry Winograd, Takeo Igarashi
2007 Proceedings of the 20th annual ACM symposium on User interface software and technology - UIST '07  
In an effort to learn what makes a useful computer vision design tool, we created Eyepatch, a tool for designing camera-based interactions, and evaluated the Eyepatch prototype through deployment to students  ...  Cameras are a useful source of input for many interactive applications, but computer vision programming is difficult and requires specialized knowledge that is out of reach for many HCI practitioners.  ...  ACKNOWLEDGMENTS We thank the students of CS377S for their creative project ideas, their valuable design suggestions, and their helpful bug reports.  ... 
doi:10.1145/1294211.1294219 dblp:conf/uist/Maynes-AminzadeWI07 fatcat:kboakun6qrg2tjhpicwjkd7boy

A language to define multi-touch interactions

Shahedul Huq Khandkar, Frank Maurer
2010 ACM International Conference on Interactive Tabletops and Surfaces - ITS '10  
Touch has become a common interface for human computer interaction.  ...  Incorporating a touch interface in an application requires translating meaningful touches into system recognizable events.  ...  O., Wilson, A. D. and Li, Y. Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. UIST '07 4. Kratz, S., and Rohs, M.  ... 
doi:10.1145/1936652.1936710 dblp:conf/tabletop/KhandkarM10 fatcat:4ijwo55difbflcygdmlebspnc4
Showing results 1 — 15 out of 1,045 results