Manual deskterity

Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, Bill Buxton
2010 Proceedings of the 28th international conference on Human factors in computing systems, extended abstracts - CHI EA '10  
Manual Deskterity is a prototype digital drafting table that supports both pen and touch input.  ...  We also explore the simultaneous use of pen and touch to support novel compound gestures.  ...  Related Work Many current direct input systems employ only one of touch or pen input.  ... 
doi:10.1145/1753846.1753865 dblp:conf/chi/HinckleyYPCRWBB10 fatcat:erwsvfzrnzchfcwxwyd7jwd5gy

Enabling tangible interaction on capacitive touch panels

Neng-Hao Yu, Li-Wei Chan, Lung-Pan Cheng, Mike Y. Chen, Yi-Ping Hung
2010 Adjunct proceedings of the 23rd annual ACM symposium on User interface software and technology - UIST '10  
This work presents a simultaneous tangible + touch input system, exploring how tangible inputs (e.g., pen, eraser, etc.) and simple gestures work together on capacitive touch panels.  ...  We propose two approaches to sense tangible objects on capacitive touch screens, which are used in off-the-shelf multi-touch devices such as the Apple iPad, iPhone, and 3M's multi-touch displays.  ...  We plan to explore interaction techniques, such as those from "Manual Deskterity" [2], based on using simultaneous TUI + touch direct input on capacitive touch panels.  ... 
doi:10.1145/1866218.1866269 dblp:conf/uist/YuCCCH10 fatcat:ogrlkety6rcqpnyjpjasri7uvy

Conté

Daniel Vogel, Géry Casiez
2011 Proceedings of the 24th annual ACM symposium on User interface software and technology - UIST '11  
Conté also has a natural compatibility with multi-touch input: it can be tucked in the palm to interleave same-hand touch input, or used to expand the vocabulary of bimanual touch.  ...  Conté's rectangular prism shape enables both precise pen-like input and tangible handle interaction.  ...  Conté can also realize the "pen + touch = new tools" design philosophy of Manual Deskterity.  ... 
doi:10.1145/2047196.2047242 dblp:conf/uist/VogelC11 fatcat:vq7nnm5oibdplcyfamnt2jajfm

Pen + touch = new tools

Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, Bill Buxton
2010 Proceedings of the 23rd annual ACM symposium on User interface software and technology - UIST '10  
We describe techniques for direct pen+touch input. We observe people's manual behaviors with physical paper and notebooks.  ...  Based on our explorations we advocate a division of labor between pen and touch: the pen writes, touch manipulates, and the combination of pen + touch yields new tools.  ...  Indeed, our exploration of Manual Deskterity convinces us that if each input modality offers complete coverage of all interactions, it quickly robs the combination of pen and touch of much of its vigor  ... 
doi:10.1145/1866029.1866036 dblp:conf/uist/HinckleyYPCRWBB10 fatcat:exge3xsknbgrfjx26536lpyjaq

TUIC

Neng-Hao Yu, Polly Huang, Yi-Ping Hung, Li-Wei Chan, Seng Yong Lau, Sung-Sheng Tsai, I-Chun Hsiao, Dian-Je Tsai, Fang-I Hsiao, Lung-Pan Cheng, Mike Chen
2011 Proceedings of the 2011 annual conference on Human factors in computing systems - CHI '11  
TUIC simulates finger touches on capacitive displays using passive materials and active modulation circuits embedded inside tangible objects, and can be used with multi-touch gestures simultaneously.  ...  TUIC consists of three approaches to sense and track objects: spatial, frequency, and hybrid (spatial plus frequency).  ...  ACKNOWLEDGMENTS This work was supported in part by the Excellent Research Projects of National Taiwan University, under grants 99R80303, and by the National Science Council, Taiwan, under grant NSC 98-2221  ... 
doi:10.1145/1978942.1979386 dblp:conf/chi/YuCLTHTHCCHH11 fatcat:f5ugmrybgfaw3i3ist724yhjxy

Experimental analysis of touch-screen gesture designs in mobile environments

Andrew Bragdon, Eugene Nelson, Yang Li, Ken Hinckley
2011 Proceedings of the 2011 annual conference on Human factors in computing systems - CHI '11  
Direct-touch interaction on mobile phones revolves around screens that compete for visual attention with users' real-world tasks and activities.  ...  This paper investigates the impact of these situational impairments on touch-screen interaction.  ...  Manual Deskterity [15] uses a similar approach to create objects. This work forms the basis of the bezel moding technique presented in this paper.  ... 
doi:10.1145/1978942.1979000 dblp:conf/chi/BragdonNLH11 fatcat:wpitw4w45bhhthnqsaeyelqgdu

The Fundamental Issues of Pen-Based Interaction with Tablet Devices

Michelle K Annett
2014
The thesis also presents an in-depth exploration of unintended touch.  ...  The proposed Latency Perception Model has provided a cohesive understanding of touch- and pen-based latency perception, and a solid foundation upon which future explorations of latency can occur.  ...  techniques or domain-specific functionality, such as simultaneous or interleaved pen and touch input (Hinckley et al., 2010a,b) .  ... 
doi:10.7939/r3q23r65d fatcat:ejwd6snd2jcmbmblhtlitettsy

Multimodal Content Delivery for Geo-services

Keith Gardiner
2015
...  demonstrate novel forms of multimodal input.  ...  Information overload is an active concern for location-based applications that struggle to manage large amounts of data; contributions in the area of egocentric visibility filter data based on field-of-view.  ...  "Manual deskterity: an exploration of simultaneous pen + touch direct input."  ... 
doi:10.21427/d7688v fatcat:cgvezx6yhrcx7nnn6qarmh7lry