
Applying bimanual interaction principles to text input on multi-touch surfaces and tabletops

Liam Don, Shamus P. Smith
2010 ACM International Conference on Interactive Tabletops and Surfaces - ITS '10  
Multi-touch surfaces and tabletops present new challenges and possibilities for text input.  ...  By basing designs on established theoretical models of bimanual interaction, it is possible to evaluate the best choice of bimanual technique for a novel form of text input.  ...  There is a noted distinction between symmetric actions (e.g. the common multi-touch scaling gesture) and asymmetric actions (e.g. scrolling with the left hand and selecting with the right) [2].  ... 
doi:10.1145/1936652.1936702 dblp:conf/tabletop/DonS10 fatcat:y2kxewit7ne4pnwbxehwtpvucm

Occlusion-aware menu design for digital tabletops

Peter Brandl, Jakob Leitner, Thomas Seifried, Michael Haller, Bernard Doray, Paul To
2009 Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09  
In this paper, we describe the design of menus for multi-user digital tabletops. On direct input surfaces, occlusions created by the user's hand decrease interaction performance with menus.  ...  As an extension, we propose adding a gesture input area for fast interaction which can be partly occluded by the user's hand.  ...  He reports that the movement along the top-left to bottom-right axis is fastest for left-handed users, and the mirrored movement along the top-right to bottom-left axis for right-handed users.  ... 
doi:10.1145/1520340.1520461 dblp:conf/chi/BrandlLSHDT09 fatcat:epy3ineeezdnnj23ayyghefwre
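
The handedness effect noted in this entry suggests a simple placement rule: offset the menu away from the region the active hand is likely to cover. The Python sketch below is illustrative only; the offsets and the assumed occlusion pattern are our own guesses, not the design from the paper.

```python
def menu_position(touch_x, touch_y, handedness, offset=80):
    """Offset a menu away from the area the touching hand likely occludes.

    Illustrative assumption (not from the paper): a right hand and forearm
    tend to cover the region below and to the right of the touch point, so
    the menu is placed above and to the left; mirrored for left-handed users.
    Offsets are arbitrary pixel values.
    """
    if handedness == "right":
        return touch_x - offset, touch_y - offset
    return touch_x + offset, touch_y - offset

print(menu_position(400, 300, "right"))  # -> (320, 220)
```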

Medusa

Michelle Annett, Tovi Grossman, Daniel Wigdor, George Fitzmaurice
2011 Proceedings of the 24th annual ACM symposium on User interface software and technology - UIST '11  
We present Medusa, a proximity-aware multi-touch tabletop.  ...  Medusa uses 138 inexpensive proximity sensors to: detect a user's presence and location, determine body and arm locations, distinguish between the right and left arms, and map touch points to specific users  ...  Bimanual Input Distinction While bimanual input is commonly used in tabletop applications, Medusa is able to distinguish between the user's left and right hands.  ... 
doi:10.1145/2047196.2047240 dblp:conf/uist/AnnettGWF11 fatcat:h7quta4yrve6leuvqspqasbdhe
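
As a rough illustration of the touch-to-user mapping mentioned in this entry, the sketch below assigns each touch to the nearest tracked arm. The data structures and the nearest-neighbour heuristic are hypothetical; Medusa's actual pipeline is built on its ring of proximity sensors.

```python
import math
from dataclasses import dataclass

@dataclass
class Arm:
    user_id: int   # tracked user this arm belongs to
    side: str      # "left" or "right"
    x: float       # estimated arm/hand position over the surface
    y: float

def assign_touch(touch_x, touch_y, arms):
    """Map a touch point to the closest tracked arm, and thus to a user."""
    return min(arms, key=lambda a: math.hypot(touch_x - a.x, touch_y - a.y))

arms = [Arm(1, "left", 0.30, 0.25), Arm(1, "right", 0.60, 0.20),
        Arm(2, "right", 0.80, 0.90)]
owner = assign_touch(0.58, 0.22, arms)
print(owner.user_id, owner.side)  # -> 1 right
```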

See me, see you

Hong Zhang, Xing-Dong Yang, Barrett Ens, Hai-Ning Liang, Pierre Boulanger, Pourang Irani
2012 Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems - CHI '12  
See Me, See You is a viable solution for providing simple yet effective support for multi-user application features on tabletops.  ...  We present See Me, See You, a lightweight approach for discriminating user touches on a vision-based tabletop.  ...  We also thank our lab mates for their valuable feedback and acknowledge NSERC for partially funding this project.  ... 
doi:10.1145/2207676.2208392 dblp:conf/chi/ZhangYELBI12 fatcat:flfaignnjnd3rbnzppc7kwsmci

CThru

Hao Jiang, Alain Viel, Meekal Bajaj, Robert A. Lue, Chia Shen
2009 Proceedings of the 27th international conference on Human factors in computing systems - CHI 09  
We also discuss a pilot study and what it revealed with respect to CThru's interface and the usage pattern of the tabletop and the associated large wall display.  ...  in a multi-dimensional information space.  ...  ACKNOWLEDGEMENT We thank Matthew Bohan and Tom Torello for kindly generating the supplemental multimedia material.  ... 
doi:10.1145/1518701.1518887 dblp:conf/chi/JiangVBLS09 fatcat:ajdpmm3iyzfvrhnoakputtmske

Counting on your fingertips

Rama Vennelakanti, Anbumani Subramanian, Sriganesh Madhvanath, Sriram Subramanian
2011 Proceedings of the 3rd International Conference on Human Computer Interaction - IndiaHCI '11  
Although multi-touch technology and horizontal interactive surfaces have been around for a decade now, there is limited understanding of how users use the Rich Touch space and multiple fingers to manipulate  ...  Our investigation shows that user interactions can be described in terms of a small set of actions, and there are insightful ways in which hands are used, and in the number of fingers used to carry out these actions  ...  In order to explore how this division of labour happens in tabletop tasks, we studied the probability of joint occurrence of actions performed using the left and right hands, and the results for the touch  ... 
doi:10.1145/2407796.2407800 fatcat:44zumnrkuze55k3dk44rnnfvcm
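
The joint-occurrence analysis mentioned in this entry amounts to estimating P(left-hand action, right-hand action) from observation logs. A minimal sketch, with made-up action names and data:

```python
from collections import Counter

def joint_action_probabilities(episodes):
    """Estimate the probability of each (left-hand, right-hand) action pair.

    `episodes` is a list of (left_action, right_action) tuples logged per
    observed interaction; the action vocabulary here is purely illustrative.
    """
    counts = Counter(episodes)
    total = sum(counts.values())
    return {pair: n / total for pair, n in counts.items()}

log = [("hold", "drag"), ("hold", "drag"), ("idle", "tap"), ("hold", "rotate")]
print(joint_action_probabilities(log))
```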

Interaction with stereoscopic data on and above multi-touch surfaces

Florian Daiber
2011 Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces - ITS '11  
With the increasing distribution of multi-touch capable devices, multi-touch interaction becomes more and more ubiquitous.  ...  This research will evaluate multi-touch and gestural 3D interaction on and above interactive surfaces and explore the design space of interaction with stereoscopic data.  ...  Reisman et al. proposed interactions for direct manipulation of 3D objects through rotation, scale and translation [16] and Hancock introduced force-based 3D interactions for multi-touch tabletops [  ... 
doi:10.1145/2076354.2076428 dblp:conf/tabletop/Daiber11 fatcat:uuo2ua7m4nakpn5yctrszbfgla

Usage and Recognition of Finger Orientation for Multi-Touch Tabletop Interaction [chapter]

Chi Tai Dang, Elisabeth André
2011 Lecture Notes in Computer Science  
Afterwards, we present a simple and fast approach to detect the finger orientation reliably for multi-touch tabletop interaction.  ...  Further, recognition rates on real data gained from the camera within a multi-touch tabletop are presented in order to give a measure for the precision and reliability of the presented approach.  ...  tabletop surface, for example Figure 3 on the left-hand side.  ... 
doi:10.1007/978-3-642-23765-2_28 fatcat:ev5gu3ktljcgnmm32cj4u4uawq
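
A common way to estimate finger orientation from a camera image is to fit an axis to the elongated contact blob using image moments; the sketch below shows that general idea in isolation and is not the specific approach evaluated in the chapter (which also has to resolve which end of the axis is the fingertip).

```python
import numpy as np

def blob_axis_angle(mask):
    """Angle (radians) of the major axis of a binary contact blob.

    Computed from second-order central moments; a full orientation detector
    would additionally disambiguate fingertip versus finger base.
    """
    ys, xs = np.nonzero(mask)
    xc, yc = xs.mean(), ys.mean()
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)

# Synthetic blob elongated along the x axis -> angle close to 0 degrees.
mask = np.zeros((20, 20), dtype=bool)
mask[9:11, 4:16] = True
print(np.degrees(blob_axis_angle(mask)))
```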

HandyWidgets

Takuto Yoshikawa, Buntarou Shizuki, Jiro Tanaka
2012 Proceedings of the 2012 ACM international conference on Interactive tabletops and surfaces - ITS '12  
Large multi-touch tabletops are useful for collocated collaborative work involving multiple users.  ...  To solve these problems, we present HandyWidgets, widgets that are localized around users' hands. The widgets are quickly invoked by a bimanual multi-touch gesture which we call "pull-out".  ...  INTRODUCTION Large multi-touch tabletops are useful for collocated collaborative work involving multiple users. Users surrounding tabletops extend their hands and touch the screen at their location.  ... 
doi:10.1145/2396636.2396667 dblp:conf/tabletop/YoshikawaST12 fatcat:a25ehhazefh3dlp5slxmvnw7pi
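
The exact mechanics of the "pull-out" gesture are not recoverable from this snippet; the sketch below is one plausible reading, in which one touch anchors while a second touch is dragged away beyond a threshold, invoking a widget near the anchoring hand. All names and thresholds are invented for illustration.

```python
import math

ANCHOR_TOLERANCE = 20.0     # px the anchoring touch may drift (invented value)
PULL_OUT_DISTANCE = 120.0   # px the second touch must travel (invented value)

def pull_out_triggered(anchor_start, anchor_now, pull_start, pull_now):
    """Guess at a pull-out style trigger: one hand holds still, the other pulls away."""
    return (math.dist(anchor_start, anchor_now) < ANCHOR_TOLERANCE and
            math.dist(pull_start, pull_now) > PULL_OUT_DISTANCE)

if pull_out_triggered((100, 100), (103, 98), (140, 110), (300, 120)):
    print("invoke widget near the anchoring hand at (100, 100)")
```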

Touching the Void Revisited: Analyses of Touch Behavior on and above Tabletop Surfaces [chapter]

Gerd Bruder, Frank Steinicke, Wolfgang Stuerzlinger
2013 Lecture Notes in Computer Science  
Finally, we discuss implications for the development of future touch-sensitive interfaces with stereoscopic display.  ...  Recent developments in touch and display technologies made it possible to integrate touch-sensitive surfaces into stereoscopic three-dimensional (3D) displays.  ...  This work was partly supported by grants from the Deutsche Forschungsgemeinschaft and the Natural Sciences and Engineering Research Council of Canada.  ... 
doi:10.1007/978-3-642-40483-2_19 fatcat:rofilffynbbfhee7pinm7j6oke

Semiautomatic and User-Centered Orientation of Digital Artifacts on Multi-touch Tabletops [chapter]

Lorenz Barnkow, Kai von Luck
2012 Lecture Notes in Computer Science  
Similarly, the roles of orientation have to be taken into account when implementing software for multi-touch tabletops.  ...  The orientation of objects on tables is of fundamental importance for the coordination, communication and proper understanding of content in group work.  ...  Methods for the orientation of artifacts on multi-touch tabletops can be divided into three categories: manual, automatic and combined.  ... 
doi:10.1007/978-3-642-33542-6_34 fatcat:2w7mm6ym6jgizilr4e3gefidzi
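
One simple automatic-orientation policy, shown purely for illustration (the paper itself proposes a semiautomatic, user-centered scheme), is to rotate an artifact so that it reads upright for a viewer at the nearest table edge:

```python
def rotation_for_nearest_edge(x, y, table_w, table_h):
    """Rotation (degrees) so an artifact faces the nearest table edge.

    The angle-to-edge mapping is arbitrary; the point is only the heuristic
    of orienting content toward the closest likely viewer position.
    """
    distances = {
        0: table_h - y,    # viewer at the bottom edge
        180: y,            # viewer at the top edge
        90: table_w - x,   # viewer at the right edge
        270: x,            # viewer at the left edge
    }
    return min(distances, key=distances.get)

print(rotation_for_nearest_edge(0.9, 0.5, 1.0, 1.0))  # -> 90 (right edge)
```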

The Continuous Interaction Space: Interaction Techniques Unifying Touch and Gesture on and above a Digital Surface [chapter]

Nicolai Marquardt, Ricardo Jota, Saul Greenberg, Joaquim A. Jorge
2011 Lecture Notes in Computer Science  
Most interactions fall into one of two modalities: 1) direct touch and multi-touch (by hand and by tangibles) directly on the surface, and 2) hand gestures above the surface.  ...  For example, with our Extended Continuous Gestures category, a person can start an interaction with a direct touch and drag, then naturally lift off the surface and continue their drag with a hand gesture  ...  Ricardo Jota was supported by the Portuguese Foundation for Science and Technology (SFRH/BD/ 17574/2004).  ... 
doi:10.1007/978-3-642-23765-2_32 fatcat:mwiz2xzsufhohlcbla5mazehwa
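
The unifying idea can be caricatured as classifying a tracked hand by its height above the surface. The zone boundaries below are invented values; the paper's contribution is the set of techniques that move continuously across these zones rather than the classification itself.

```python
CONTACT_EPSILON = 0.005   # metres; invented contact threshold
SPACE_CEILING = 0.40      # metres; invented upper bound of the gesture space

def interaction_zone(height):
    """Classify a hand by height above the surface (illustrative only)."""
    if height <= CONTACT_EPSILON:
        return "direct touch on the surface"
    if height <= SPACE_CEILING:
        return "gesture above the surface"
    return "outside the interaction space"

for h in (0.0, 0.12, 0.8):
    print(h, interaction_zone(h))
```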

Supporting Atomic User Actions on the Table [chapter]

Dzmitry Aliakseyeu, Sriram Subramanian, Jason Alexander
2010 Tabletops - Horizontal Interactive Displays  
User actions, such as pointing, selecting, scrolling and menu navigation, are often taken for granted in desktop GUI interactions, but have no equivalent interaction techniques in tabletop systems.  ...  In this chapter we present a review of the state-of-the-art in interaction techniques for selecting, pointing, rotating, and scrolling.  ...  Fig. 10.1: Two-state model as applicable in basic interactive tabletop systems (left) and three-state model as applicable in various stylus-based systems [16], such as from Wacom (right).  ... 
doi:10.1007/978-1-84996-113-4_10 dblp:series/hci/AliakseyeuSA10 fatcat:cmy2c3ikijbblkwyyikii2ghsu
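
The two-state versus three-state distinction in the figure caption corresponds to whether the device can sense a hovering "tracking" state (as Wacom-style styluses can) or only contact. A minimal state-machine sketch of the three-state model, with details assumed rather than taken from the chapter:

```python
from enum import Enum, auto

class State(Enum):
    OUT_OF_RANGE = auto()  # device senses nothing
    TRACKING = auto()      # hovering: position sensed, no contact
    DRAGGING = auto()      # contact with the surface

def next_state(in_range, touching):
    """Transition of a Buxton-style three-state input model.

    A basic multi-touch tabletop collapses this to two states (OUT_OF_RANGE
    and DRAGGING) because a hovering finger is not sensed.
    """
    if touching:
        return State.DRAGGING
    return State.TRACKING if in_range else State.OUT_OF_RANGE

for sample in [(True, False), (True, True), (True, False), (False, False)]:
    print(next_state(*sample).name)
```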

Fighting for control

Paul Marshall, Rowanne Fleck, Amanda Harris, Jochen Rick, Eva Hornecker, Yvonne Rogers, Nicola Yuill, Nick Sheep Dalton
2009 Proceedings of the 27th international conference on Human factors in computing systems - CHI 09  
Tabletop and tangible interfaces are often described in terms of their support for shared access to digital resources.  ...  We discuss how children fight for and maintain control of physical versus digital objects in terms of embodied interaction and what this means when designing collaborative applications for shareable interfaces  ...  We're very grateful to Alissa Antle and the reviewers of this and an earlier Tabletop submission for suggested improvements. This work was supported by the EPSRC ShareIT Project grant EP/F017324/1.  ... 
doi:10.1145/1518701.1519027 dblp:conf/chi/MarshallFHRHRYD09 fatcat:52eczycoovhlzbgb4sqnxyhueq

Affordances for manipulation of physical versus digital media on interactive surfaces

Lucia Terrenghi, David Kirk, Abigail Sellen, Shahram Izadi
2007 Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '07  
Participants carried out both a puzzle task and a photo sorting task in two different modes: in a physical 3-dimensional space and on a multi-touch, interactive tabletop in which the digital items resembled  ...  By observing the interaction behaviors of 12 participants, we explore the main differences and discuss what this means for designing interactive surfaces which use aspects of the physical world as a design  ...  ACKNOWLEDGMENTS Thanks to all the people who participated in the study and gave freely of their time and effort.  ... 
doi:10.1145/1240624.1240799 dblp:conf/chi/TerrenghiKSI07 fatcat:ryz2capvdvglde3bseas3n4jsy