4,515 Hits in 4.2 sec

Preprint Extending Touch-less Interaction on Vision Based Wearable Device [article]

Zhihan Lv, Liangbing Feng, Shengzhong Feng, Haibo Li
2015 arXiv   pre-print
This is the preprint version of our paper at the IEEE Virtual Reality Conference 2015. A touch-less interaction technology on a vision-based wearable device is designed and evaluated.  ...  Several proof-of-concept prototypes with eleven dynamic gestures are developed based on the touch-less interaction.  ...  A series of designed gestures and their evaluation related to touch-less interaction technology on a vision-based wearable device is proposed.  ... 
arXiv:1504.01025v2 fatcat:yzhbb3a5ubaqvddzyzov2lm7pe

Preprint Touch-less Interactive Augmented Reality Game on Vision Based Wearable Device [article]

Zhihan Lv, Alaa Halawani, Shengzhong Feng, Shafiq ur Rehman, Haibo Li
2015 arXiv   pre-print
In order to develop touch-less, interactive and augmented reality games on a vision-based wearable device, a touch-less motion interaction technology is designed and evaluated in this work.  ...  Three primitive augmented reality games with eleven dynamic gestures are developed as a proof of concept based on the proposed touch-less interaction technology.  ...  The authors are thankful to Muhammad Sikandar Lal Khan for the preliminary hardware device support, to Liangbing Feng for kind help at SIAT and to our friends for their fruitful discussions and code-sharing  ... 
arXiv:1504.06359v5 fatcat:6g3elgcievbo5dzioctay5ayd4

Wearable Smartphone: Wearable Hybrid Framework for Hand and Foot Gesture Interaction on Smartphone

Zhihan Lv
2013 2013 IEEE International Conference on Computer Vision Workshops  
A novel smartphone wearable hybrid interaction framework based on mixed low-cost hardware and software is proposed in this work.  ...  The user study evaluation demonstrates the social acceptability of the designed hand/foot gestures and the usability of the applications on the proposed wearable hybrid framework with touch-less interaction  ...  We think these metaphors will be well established for a wearable hybrid framework based on touch-less interaction.  ... 
doi:10.1109/iccvw.2013.64 dblp:conf/iccvw/Lv13 fatcat:7qygqpxsprchpoo4e2ar6i4mfi

WUW - wear Ur world

Pranav Mistry, Pattie Maes, Liyan Chang
2009 Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09  
By using a tiny projector and a camera mounted on a hat or coupled in a pendant-like wearable device, WUW sees what the user sees and visually augments surfaces or physical objects the user is interacting  ...  In this paper, we introduce WUW, a wearable gestural interface, which attempts to bring information out into the tangible world.  ...  Related Work Recently, there has been a great variety of multi-touch interaction based tabletops (e.g. [5, 6, 11, 13]) and mobile devices (e.g.  ... 
doi:10.1145/1520340.1520626 dblp:conf/chi/MistryMC09 fatcat:yisnjklybjaurhzj4qnalxa6ty

Mobile collocated interactions with wearables: past, present, and future

Andrés Lucero, James Clawson, Joel Fischer, Simon Robinson
2016 mUX The Journal of Mobile User Experience  
This Special Issue focuses on the emerging use of wearable technologies for mobile collocated interactions.  ...  We conclude by providing an overview of a series of workshops on the topic, and introduce the two main articles that comprise this Special Issue.  ...  Acknowledgement The field of mobile collocated interactions has greatly expanded since the beginnings of this workshop series, and we have been excited to follow and participate in new developments in  ... 
doi:10.1186/s13678-016-0008-x fatcat:hejazob2szezzhdowwxsnozosy

WatchSense: On- and Above-Skin Input Sensing through a Wearable Depth Sensor
Srinath Sridhar, Anders Markussen, Antti Oulasvirta, Christian Theobalt, Sebastian Boring
2017 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI '17  
(c) It tracks the 3D position of fingertips, their identities, and touch on the back of the hand (BOH) in real time on consumer mobile devices.  ...  WatchSense uses a depth sensor embedded in a wearable device to expand the input space to neighboring areas of skin and the space above it.  ...  Laser-based range scanners [34, 33] as well as infrared sensors placed at the device's borders [3, 25, 32] are vision-based approaches to detect on-skin touch and gesture interaction around a device  ... 
doi:10.1145/3025453.3026005 dblp:conf/chi/0002MOTB17 fatcat:boh2houtp5gg5g43dqv4tluyke

Towards Augmented Reality-driven Human-City Interaction: Current Research on Mobile Headsets and Future Challenges [article]

Lik Hang Lee, Tristan Braud, Simo Hosio, Pan Hui
2021 arXiv   pre-print
This survey discusses 260 articles (68.8% of which were published between 2015 and 2019) to review the field of human interaction in connected cities with an emphasis on augmented reality-driven interaction.  ...  We provide an overview of Human-City Interaction and related technological approaches, followed by a review of the latest trends in information visualization, constrained interfaces, and embodied interaction  ...  In [100], a vision-based system detects the hand location and an IMU-driven ring device determines the touch event with virtual overlays in AR. Below are some examples of finger-to-arm interaction.  ... 
arXiv:2007.09207v2 fatcat:ymyrfxbvwfg2jfnweqxvmwl3t4

Opportunities and Challenges of Smartglass-Assisted Interactive Telementoring

Hyoseok Yoon
2021 Applied System Innovation  
Based on this analysis, we define what can be integrated into smartglass-enabled interactive telementoring.  ...  The widespread adoption of wearables, extended reality, and metaverses has accelerated the diverse configurations of remote collaboration and telementoring systems.  ...  For other domains, touch-based UIs for wearables are popular and considered default. Smartglasses such as Google Glass, HoloLens, and Vuzix Blade all provide on-device touch-based UIs.  ... 
doi:10.3390/asi4030056 fatcat:urcz3ii45fbxrf6273byvwek44

Current and future mobile and wearable device use by people with visual impairments

Hanlu Ye, Meethu Malu, Uran Oh, Leah Findlater
2014 Proceedings of the 32nd annual ACM conference on Human factors in computing systems - CHI '14  
Participants with visual impairments also responded positively to two eyes-free wearable device scenarios: a wristband or ring and a glasses-based device.  ...  on the go and to participate in certain social interactions.  ...  The Navigation button was a pressure-based touch panel that used a 50mm touch potentiometer.  ... 
doi:10.1145/2556288.2557085 dblp:conf/chi/YeMOF14 fatcat:bn37ieztzja2dgfs4ncjemwdtu

Skin--The Next User Interface

Jürgen Steimle
2016 Computer  
But what's highly effective on handheld devices is a challenge on wearables. To ensure users can comfortably wear devices, the trend has been to make wearables ever smaller.  ...  For this vision to become a reality, we need to solve three main problems. First, how can skin-based interactions be usable, useful, and socially acceptable?  ... 
doi:10.1109/mc.2016.93 fatcat:l7vmr7o6ljeinn2zaz2uwidp3q

Challenges in mobile multi-device ecosystems

Jens Grubert, Matthias Kranz, Aaron Quigley
2016 mUX The Journal of Mobile User Experience  
We base our findings on a literature review and an expert survey. Specifically, we present grounded challenges relevant to the design, development and use of mobile multi-device environments.  ...  Personal and intimate mobile and wearable devices such as head-mounted displays, smartwatches, smartphones and tablets are rarely part of such multi-device ecosystems.  ...  However, while, for example, multi-display interaction is already common in stationary scenarios, to date we see less cross-device and multi-display interaction [6, 7], which include mobile or wearable  ... 
doi:10.1186/s13678-016-0007-y fatcat:whsfd2wrrfg5tpq5qmwwxivcc4

Theme issue on mobile and pervasive games

Damianos Gavalas, Vlasios Kasapakis, Bin Guo
2015 Personal and Ubiquitous Computing  
The last paper, entitled "Touch-less Interactive Augmented Reality Game on Vision Based Wearable Device" (by Zhihan Lv, Alaa Halawani, Shengzhong Feng, Shafiq ur Réhman and Haibo Li), presents a touch-less  ...  A user study on the framework revealed that Google Glass was the preferable platform for touch-less interaction and also that touch-less interfaces can serve as valid substitutes for present touch-based  ... 
doi:10.1007/s00779-015-0848-x fatcat:glb5wzqvdrdmbpezxnqt3hv3eq

A survey on haptic technologies for mobile augmented reality [article]

Carlos Bermejo, Pan Hui
2017 arXiv   pre-print
Due to the mobile capabilities of MAR applications, we mainly focus our study on wearable haptic devices for each category and their AR possibilities.  ...  This survey reviews current research issues in the area of human computer interaction for MAR and haptic devices.  ...  [74] present a novel device for mid-air tactile interaction based on air-jet approaches.  ... 
arXiv:1709.00698v3 fatcat:du5enq4r4jc6fbxw7jsudtajdu

A Wearable Personal Assistant for Surgeons – Design, Evaluation, and Future Prospects

Shahram Jalaliniya, Thomas Pederson, Diako Mardanbegi
2017 EAI Endorsed Transactions on Pervasive Health and Technology  
A prototype of the WPA was developed on Google Glass for supporting surgeons in three different scenarios: (1) touch-less interaction with medical images, (2) tele-presence during surgeries, and (3) mobile  ...  We evaluated the system in a clinical simulation facility and found that while the WPA can be a viable solution for touch-less interaction and remote collaborations during surgeries, using the WPA in the  ...  This reveals the challenge of using a head pointer on the HMD for touch-less interaction.  ... 
doi:10.4108/eai.7-9-2017.153066 fatcat:yl6hhgr76ffjrkaa43qugnxzpi

Imaginary interfaces

Sean Gustafson
2012 Proceedings of the 2012 ACM annual conference extended abstracts on Human Factors in Computing Systems Extended Abstracts - CHI EA '12  
With Imaginary Interfaces we re-enable spatial interaction on screenless devices. Users point and draw in the empty space in front of them or on the palm of their hands.  ...  Screenless mobile devices achieve maximum mobility, but at the expense of the visual feedback that is generally assumed to be necessary for spatial interaction.  ...  For instance, Skinput [5] with vibration-based touch sensing on the hand and forearm and Sixth Sense [9] with computer-vision-based sensing both use mobile projectors to provide visual feedback.  ... 
doi:10.1145/2212776.2212867 dblp:conf/chi/Gustafson12 fatcat:md32epu4svhefgc2ief6s7xspa
Showing results 1 — 15 out of 4,515 results