A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2013; you can also visit the original URL.
The file type is application/pdf.
Eliciting usable gestures for multi-display environments
2012
Proceedings of the 2012 ACM international conference on Interactive tabletops and surfaces - ITS '12
Multi-display environments (MDEs) have advanced rapidly in recent years, incorporating multi-touch tabletops, tablets, wall displays and even position tracking systems. Designers have proposed a variety of interesting gestures for use in an MDE, some of which involve a user moving their hands, arms, body or even a device itself. These gestures are often used as part of interactions to move data between the various components of an MDE, which is a longstanding research problem. But designers, …
doi:10.1145/2396636.2396643
dblp:conf/tabletop/SeyedBSMT12
fatcat:bngooyupxbg4nkt5ohm36esl5e