
Exploring Direct 3D Interaction for Full Horizontal Parallax Light Field Displays Using Leap Motion Controller

Vamsi Adhikarla, Jaka Sodnik, Peter Szolgay, Grega Jakus
2015 Sensors  
We propose an interaction setup combining the visualization of objects within the Field Of View (FOV) of a light field display and their selection through freehand gestures tracked by the Leap Motion Controller  ...  Each scene point is rendered individually, resulting in more realistic and accurate 3D visualization compared to other 3D display technologies.  ...  Figure 10. Interaction with the light field display using the Leap Motion Controller as a finger tracking device.  ... 
doi:10.3390/s150408642 pmid:25875189 pmcid:PMC4431238 fatcat:4cmatp7txzc47algquu44ihm6m

A Review on Technical and Clinical Impact of Microsoft Kinect on Physical Therapy and Rehabilitation

Hossein Mousavi Hondori, Maryam Khademi
2014 Journal of Medical Engineering  
Finally, to serve as a technical comparison and to help future rehabilitation design, other sensors similar to Kinect are reviewed.  ...  Search results in PubMed and Google Scholar reveal increasing interest in using Kinect in medical applications.  ...  Leap Motion Controller. Leap is another motion sensing device by Leap Motion. The main motivation behind building  ...  Table 3: Nonclinically evaluated systems using Kinect.  ... 
doi:10.1155/2014/846514 pmid:27006935 pmcid:PMC4782741 fatcat:6bvvpfyitnb7dm4a555e4jy6qm

MamuLEDs: Mixed Reality meets Mamulengo

Jarbas Jácome, Maria Oliveira, Fernando Alvim, Veronica Teichrieb, Geber Ramalho
2020 Journal of Interactive Systems  
We used the method of qualitative case study, analyzing the audiovisual records of essays and presentation, the software solution developed in Unity 3D, and the data collected from a focus group interview  ...  Techniques of Virtual, Augmented, and Mixed Reality (VR, AR, MR) have been used for puppet theater in different cultural contexts around the world.  ...  The research "Unreal Interactive Puppet Game Development Using Leap Motion" uses Leap Motion as an input for a fighting game in which each hand tracked by the sensor controls a puppet (Huang, Huang, &  ... 
doi:10.5753/jis.2020.771 fatcat:fgkrjlw5abe6ravinc2fccdewm

Ontology-Based Interactive Animation/Game Generation for Chinese Shadow Play Preservation

Hui Liang, Shujie Deng, Jian Chang, Jian Jun Zhang, Can Chen, Ruofeng Tong
2016 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES)  
We propose a semantic framework using ontological methods to model the construction of interactive animation and to promote the integration process in a systematic and standardised way.  ...  As an essential tool for solving problems in many research areas, using standardised and structured terminologies, an ontological analysis could concisely describe the core logic of complex systems at  ...  In this diagram, Leap Motion is an instance of the class "Motion Sensor", which is a subclass of "Device".  ... 
doi:10.1109/vs-games.2016.7590355 dblp:conf/vsgames/LiangDCZCT16a fatcat:on2bq5jzfvbdhlv2au45ylfnli
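
The snippet above describes a subclass/instance relation from the paper's ontology: "Leap Motion" is an instance of "Motion Sensor", which is a subclass of "Device". Below is a minimal sketch of that relation as RDF-style triples; the predicate names (rdf:type, rdfs:subClassOf) and the inference helper are illustrative assumptions, not the authors' actual schema.

```typescript
// Hedged sketch: the ontological statement from the snippet expressed as
// RDF-style triples (subject, predicate, object). Names are illustrative only.
type Triple = [subject: string, predicate: string, object: string];

const triples: Triple[] = [
  ["MotionSensor", "rdfs:subClassOf", "Device"],      // "Motion Sensor" is a subclass of "Device"
  ["LeapMotion",   "rdf:type",        "MotionSensor"], // "Leap Motion" is an instance of "Motion Sensor"
];

// Infer that an individual also belongs to every superclass of its asserted class.
function classesOf(individual: string): string[] {
  const direct = triples
    .filter(([s, p]) => s === individual && p === "rdf:type")
    .map(([, , o]) => o);
  const closure = new Set(direct);
  for (const c of closure) {
    triples
      .filter(([s, p]) => s === c && p === "rdfs:subClassOf")
      .forEach(([, , o]) => closure.add(o));
  }
  return [...closure];
}

console.log(classesOf("LeapMotion")); // ["MotionSensor", "Device"]
```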

Ontology-Based Interactive Animation/Game Generation for Chinese Shadow Play Preservation

Hui Liang, Shujie Deng, Jian Chang, Jian Jun Zhang, Can Chen, Ruofeng Tong
2016 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES)  
We propose a semantic framework using ontological methods to model the construction of interactive animation and to promote the integration process in a systematic and standardised way.  ...  As an essential tool for solving problems in many research areas, using standardised and structured terminologies, an ontological analysis could concisely describe the core logic of complex systems at  ...  In this diagram, Leap Motion is an instance of the class "Motion Sensor", which is a subclass of "Device".  ... 
doi:10.1109/vs-games.2016.7590354 dblp:conf/vsgames/LiangDCZCT16 fatcat:rbatgthtljgtxpv4mroz62hu5e

HaptiGlow: Helping Users Position their Hands for Better Mid-Air Gestures and Ultrasound Haptic Feedback

Euan Freeman, Dong-Bach Vo, Stephen Brewster
2019 2019 IEEE World Haptics Conference (WHC)  
If a user's hand is poorly placed, input sensors may have difficulty recognising their gestures. Mid-air haptic feedback is also hard to perceive when the hand is in a poor position.  ...  Our results show the combination of ultrasound haptics and peripheral visuals is effective, with the strengths of each leading to accurate (23 mm) and fast (4.6 s) guidance in a 3D targeting task.  ...  We calculate the Euclidean distance d between the centre of the palm (from the Leap Motion sensor) and the sweet spot.  ... 
doi:10.1109/whc.2019.8816092 dblp:conf/haptics/FreemanVB19 fatcat:uvxsxzeaabdkvh64hptuwrn6ci
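
The snippet above mentions computing the Euclidean distance d between the palm centre reported by the Leap Motion sensor and a "sweet spot" above the haptic device. Below is a minimal sketch of that computation, assuming placeholder coordinates and Leap Motion's native millimetre units; it is not the authors' code.

```typescript
// Sketch of the distance computation described in the snippet.
type Vec3 = { x: number; y: number; z: number };

// Euclidean distance between two 3D points (millimetres, Leap Motion's native unit).
function euclideanDistance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

const sweetSpot: Vec3 = { x: 0, y: 200, z: 0 };      // assumed target position above the array
const palmCentre: Vec3 = { x: 15, y: 185, z: -10 };  // example palm centre from one tracker frame

// d drives the visual/haptic guidance toward the sweet spot.
const d = euclideanDistance(palmCentre, sweetSpot);
console.log(`d = ${d.toFixed(1)} mm`);
```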

Exploring the User Experience of Proxemic Hand and Pen Input Above and Aside a Drawing Screen

Ilhan Aslan, Björn Bittner, Florian Müller, Elisabeth André
2018 Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia - MUM 2018  
Motion sensor.  ...  The LeapArm 2.0 prototype is a 3D printed "robotic arm" with an inbuilt servomotor, which can be controlled to actuate the orientation of the embedded 3D controller (i.e., a Leap Motion sensor).  ... 
doi:10.1145/3282894.3282906 dblp:conf/mum/AslanBMA18 fatcat:ojqwh5iktfdsvdlo2kdtqgcsfy

Case Study on Human-Robot Interaction of the Remote-Controlled Service Robot for Elderly and Disabled Care

Nayden Chivarov, Denis Chikurtev, Stefan Chivarov, Matus Pleva, Stanislav Ondas, Jozef Juhar, Kaloyan Yovchev
2019 Computing and informatics  
In this paper, we introduce a new version of the robot Robco 19, a new Leap Motion sensor control of the robot and a new multi-channel control system.  ...  Gesture Control via Leap Motion Sensor: Leap Motion is a stereo camera sensor and its main task is to recognize the human hand.  ...  Speech-to-text recognition is done by the Google API and parsed using "word spotting". When the API returns the recognized text, we search it for the specific phrase using simple code in JavaScript.  ... 
doi:10.31577/cai_2019_5_1210 fatcat:ltawoqeiovhdbm7ob77fjyz3ui
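
The snippet above describes "word spotting" over the text returned by the speech recognizer: the transcript is searched for specific command phrases. Below is a hedged sketch of that step; the phrase list and command names are illustrative assumptions, not taken from the paper.

```typescript
// Map of trigger phrases to robot commands (illustrative placeholders).
const commands: Record<string, string> = {
  "move forward": "DRIVE_FORWARD",
  "turn left": "TURN_LEFT",
  "stop": "STOP",
};

// Return the command whose trigger phrase occurs in the recognized text, if any.
function spotCommand(transcript: string): string | undefined {
  const text = transcript.toLowerCase();
  for (const [phrase, command] of Object.entries(commands)) {
    if (text.includes(phrase)) {
      return command;
    }
  }
  return undefined;
}

console.log(spotCommand("please move forward a little")); // "DRIVE_FORWARD"
console.log(spotCommand("hello robot"));                  // undefined
```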

A conceptual framework for motion based music applications

Marcella Mandanici, Antonio Roda, Sergio Canazza
2015 2015 IEEE 2nd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)  
Imaginary projections are the core of the framework for motion based music applications presented in this paper.  ...  Their design depends on the space covered by the motion tracking device, but also on the musical feature involved in the application.  ...  The motion sensor Kinect, placed in front of the user, provides the computer with the 3D polar coordinates of the two hands.  ... 
doi:10.1109/sive.2015.7361285 dblp:conf/vr/MandaniciRC15 fatcat:gmd6safnpraxrakrwzgs5vswwe
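
The snippet above states that the Kinect provides the 3D polar coordinates of the two hands. Below is a minimal sketch, assuming the hand positions arrive as Cartesian skeleton joints and are converted to spherical (3D polar) form; it illustrates the coordinate representation only and is not the authors' implementation.

```typescript
// Convert a Cartesian hand position to 3D polar (spherical) coordinates:
// radius r, azimuth theta, inclination phi (angles in radians).
type Cartesian = { x: number; y: number; z: number };
type Polar3D = { r: number; theta: number; phi: number };

function toPolar({ x, y, z }: Cartesian): Polar3D {
  const r = Math.hypot(x, y, z);               // distance from the sensor origin
  const theta = Math.atan2(x, z);              // azimuth in the horizontal plane
  const phi = r === 0 ? 0 : Math.acos(y / r);  // inclination from the vertical axis
  return { r, theta, phi };
}

// Example: left and right hand positions in metres.
console.log(toPolar({ x: -0.3, y: 1.1, z: 2.0 }));
console.log(toPolar({ x: 0.25, y: 1.0, z: 1.9 }));
```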

A Comprehensive Review of Sign Language Recognition: Different Types, Modalities, and Datasets [article]

Dr. M. Madhiarasan, Prof. Partha Pratim Roy
2022 arXiv   pre-print
Better classification accuracy is achieved by 3D motion capture models than by Microsoft Kinect and Leap Motion sensor-based models. Liu et al. [78] pointed out ST-Net (Spatial-Temporal Net) associated with self-boosted  ...  Kinect and the Leap Motion Controller sensor are widely used for vision-based SLR. EMG devices produce better results for sensor-based SLR, and it is an ongoing research area.  ... 
arXiv:2204.03328v1 fatcat:72kb7zz5xfaqxa2l5sz22drrwi

PhotoTwinVR: An Immersive System for Manipulation, Inspection and Dimension Measurements of the 3D Photogrammetric Models of Real-Life Structures in Virtual Reality [article]

Slawomir Konrad Tadeja and Wojciech Rydlewicz and Yupu Lu and Per Ola Kristensson and Tomasz Bubas and Maciej Rydlewicz
2019 arXiv   pre-print
The system was populated with a 3D photogrammetric model of an existing pipe-rack generated using a commercial software package.  ...  Recent advancements in the development of Virtual Reality (VR) can help to unlock the full potential offered by the digital 3D-reality models generated using the state-of-the-art photogrammetric technologies  ...  Acknowledgements: The authors would also like to extend their gratitude to Bentley Systems Incorporated for allowing us to access the ContextCapture software used to generate the 3D reality meshes of the structure  ... 
arXiv:1911.09958v1 fatcat:25s4djmp3bellh43x765f34lpa

Space Connection: A New 3D Tele-immersion Platform for Web-Based Gesture-Collaborative Games and Services

Chun-Han Lin, Pei-Yu Sun, Fang Yu
2015 2015 IEEE/ACM 4th International Workshop on Games and Software Engineering  
To realize web-based 3D immersion techniques, we propose Space Connection, which integrates techniques for virtual collaboration and motion sensing techniques with the aim of pushing motion sensing a step  ...  The 3D tele-immersion technique has brought a revolutionary change to human interaction: physically separated users can interact naturally with each other through body gestures in a shared 3D virtual environment  ...  Introduction: The news that Apple acquired PrimeSense has brought 3D motion sensing techniques into the spotlight.  ... 
doi:10.1109/gas.2015.12 dblp:conf/icse/LinSY15 fatcat:6lztpugxhjffti6q2c5td4rucu

A Review of Sign Language Recognition Techniques

2021 International Journal of Information Systems and Computer Sciences  
Research works based on hand gestures have adopted many different techniques, including those based on instrumented sensor technology and computer vision.  ...  In other words, the hand sign can be classified under many headings, such as posture and gesture, as well as dynamic and static, or a hybrid of the two.  ...  Alonso et al. [4], 2020: to increase the recognition of gestures using Leap Motion; method: use of Leap Motion, approximate string  ... 
doi:10.30534/ijiscs/2021/011062021 fatcat:f3ah4kzronhs3it63typznipsq

Creativity Support and Multimodal Pen-based Interaction

Ilhan Aslan, Katharina Weitz, Ruben Schlagowski, Simon Flutura, Susana Garcia Valesco, Marius Pfeil, Elisabeth André
2019 2019 International Conference on Multimodal Interaction on - ICMI '19  
The multimodal solution uses microcontroller technology to augment a digital pen with RGB LEDs and a Leap Motion sensor to enable bimanual input.  ...  The Leap Motion sensor recognizes the distance of the second hand to the Leap Motion and sends distance information to the drawing application.  ...  The second and final iteration of the pen prototype is depicted in Figure 2 as part of the complete interaction setup, which consisted of the pen, a Leap Motion sensor (i.e. a 3D sensor, which is able  ... 
doi:10.1145/3340555.3353738 dblp:conf/icmi/AslanWSFVPA19 fatcat:dfr625gwy5ho5kn24ynbhczgnm
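
The snippet above describes forwarding the distance of the second hand, measured by the Leap Motion sensor, to the drawing application. Below is a hedged sketch of that data flow; the working range, normalisation, and sendToDrawingApp channel are hypothetical placeholders, not the paper's implementation.

```typescript
type Vec3 = [x: number, y: number, z: number];

// Distance of the palm from the sensor origin (Leap Motion reports millimetres).
function handDistance(palmPosition: Vec3): number {
  return Math.hypot(...palmPosition);
}

// Map the distance into a 0..1 control value over an assumed working range.
function normalise(distanceMm: number, minMm = 50, maxMm = 400): number {
  const t = (distanceMm - minMm) / (maxMm - minMm);
  return Math.min(1, Math.max(0, t));
}

// Placeholder for whatever channel the prototype used (e.g. OSC, WebSocket, serial).
function sendToDrawingApp(value: number): void {
  console.log(`second-hand control value: ${value.toFixed(2)}`);
}

// Per-frame update, e.g. called from the tracker's frame callback with the
// non-dominant hand's palm position.
function onFrame(secondHandPalm: Vec3): void {
  sendToDrawingApp(normalise(handDistance(secondHandPalm)));
}

onFrame([30, 180, -20]); // example palm position in mm
```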

The self-referenced DLR 3D-modeler

K. H. Strobl, E. Mair, T. Bodenmuller, S. Kielhofer, W. Sepp, M. Suppa, D. Burschka, G. Hirzinger
2009 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems  
The approach comprises an ego-motion algorithm tracking natural, distinctive features, concurrently with customary 3-D modeling of the scene.  ...  This novel development makes it possible to abandon the use of inconvenient, expensive external positioning systems.  ...  Concurrent 3-D Modeling and Ego-Motion Estimation: In Section I it was mentioned that the estimation of the pose of the 3D-Modeler from the information of its own sensors would signify a major improvement  ... 
doi:10.1109/iros.2009.5354708 dblp:conf/iros/StroblMBKSSBH09 fatcat:vm3pu4dnjbcqzp4cf27saybkhi
Showing results 1 — 15 out of 837 results