Real-time Joint Tracking of a Hand Manipulating an Object from RGB-D Input
[article]
2016
arXiv
pre-print
In this paper, we propose a real-time solution that uses a single commodity RGB-D camera. ...
Real-time simultaneous tracking of hands manipulating and interacting with external objects has many potential applications in augmented reality, tangible computing, and wearable computing. ...
An ensemble of Collaborative Trackers (ECT) for RGB-D based multi-object and multiple hand tracking is used in [14]. Their accuracy is high, but runtime is far from real-time. ...
arXiv:1610.04889v1
fatcat:eobmqcu6abc6pjalqy4m3arsze
Real-Time Joint Tracking of a Hand Manipulating an Object from RGB-D Input
[chapter]
2016
Lecture Notes in Computer Science
In this paper, we propose a real-time solution that uses a single commodity RGB-D camera. ...
Real-time simultaneous tracking of hands manipulating and interacting with external objects has many potential applications in augmented reality, tangible computing, and wearable computing. ...
An ensemble of Collaborative Trackers (ECT) for RGB-D based multi-object and multiple hand tracking is used in [14]. Their accuracy is high, but runtime is far from real-time. ...
doi:10.1007/978-3-319-46475-6_19
fatcat:epro3on5rnhhjovw62to4bbsoq
Real-Time Hand Tracking Under Occlusion from an Egocentric RGB-D Sensor
2017
2017 IEEE International Conference on Computer Vision Workshops (ICCVW)
We present an approach for real-time, robust and accurate hand pose estimation from moving egocentric RGB-D cameras in cluttered real environments. ...
in real time. ...
A method for real-time joint tracking of hands and objects from third-person viewpoints was recently proposed [22], but is limited to known objects and small occlusions. ...
doi:10.1109/iccvw.2017.82
dblp:conf/iccvw/MuellerMS0CT17
fatcat:dqdrwatsqfd5vixaxwzw3fr6lu
Real-Time Hand Tracking under Occlusion from an Egocentric RGB-D Sensor
2017
2017 IEEE International Conference on Computer Vision (ICCV)
We present an approach for real-time, robust and accurate hand pose estimation from moving egocentric RGB-D cameras in cluttered real environments. ...
in real time. ...
A method for real-time joint tracking of hands and objects from third-person viewpoints was recently proposed [27], but is limited to known objects and small occlusions. ...
doi:10.1109/iccv.2017.131
dblp:conf/iccv/MuellerMS0CT17
fatcat:y6wyzugumnffveohczf54ole4u
AR in Hand: Egocentric Palm Pose Tracking and Gesture Recognition for Augmented Reality Applications
2015
Proceedings of the 23rd ACM International Conference on Multimedia
Technically, we use a head-mounted depth camera to capture the RGB-D images from an egocentric view, and adopt a random forest to regress the palm pose and classify the hand gesture simultaneously via ...
The predicted pose and gesture are used to render the 3D virtual objects, which are overlaid onto the hand region in the input RGB images with camera calibration parameters for seamless virtual and real scene ...
In this demo we present an AR system to allow users to manipulate virtual objects freely with their bare hands based on our joint palm pose tracking and gesture recognition algorithm, so that the virtual ...
doi:10.1145/2733373.2807972
dblp:conf/mm/LiangYTM15
fatcat:hbrgvwk3ffcg3nc2n3mmhcv424
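The entry above (dblp:conf/mm/LiangYTM15) overlays 3D virtual objects onto the hand region of the RGB image using camera calibration parameters. As a rough illustration of that overlay step, here is a minimal pinhole-projection sketch; the intrinsics K and the palm pose (R, t) are hypothetical placeholder values, not the authors' calibration or regressor output.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project 3D points (N, 3) given in palm/object coordinates into pixel
    coordinates with a pinhole model: x ~ K (R X + t)."""
    cam = points_3d @ R.T + t          # transform into the camera frame
    uv = cam @ K.T                     # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide -> (N, 2) pixels

# Hypothetical calibration and palm pose (illustrative values only).
K = np.array([[525.0, 0.0, 319.5],
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                          # palm orientation from the pose regressor
t = np.array([0.0, 0.0, 0.5])          # palm 0.5 m in front of the camera
cube = np.array([[x, y, z] for x in (-0.03, 0.03)
                 for y in (-0.03, 0.03) for z in (-0.03, 0.03)])
print(project_points(cube, K, R, t))   # pixel positions used to draw the overlay
```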
GANerated Hands for Real-time 3D Hand Tracking from Monocular RGB
[article]
2017
arXiv
pre-print
We address the highly challenging problem of real-time 3D hand tracking based on a monocular RGB-only sequence. ...
We demonstrate that our hand tracking system outperforms the current state-of-the-art on challenging RGB-only footage. ...
[28] showed tracking of both the hand and a manipulated object using 8 calibrated cameras in a studio setup. Ballan et al. ...
arXiv:1712.01057v1
fatcat:7jxgkuoogfbb5itjoq2wbpftpa
GANerated Hands for Real-Time 3D Hand Tracking from Monocular RGB
2018
2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
Figure 1: We present an approach for real-time 3D hand tracking from monocular RGB-only input. ...
We address the highly challenging problem of real-time 3D hand tracking based on a monocular RGB-only sequence. ...
[28] showed tracking of both the hand and a manipulated object using 8 calibrated cameras in a studio setup. Ballan et al. ...
doi:10.1109/cvpr.2018.00013
dblp:conf/cvpr/MuellerBSM0CT18
fatcat:pw73umrjgjhdzpttehh3uljeu4
Using a single RGB frame for real time 3D hand pose estimation in the wild
[article]
2017
arXiv
pre-print
We present a method for the real-time estimation of the full 3D pose of one or more human hands using a single commodity RGB camera. ...
More specifically, given an RGB image and the relevant camera calibration information, we employ a state-of-the-art detector to localize hands. ...
This approach enables real-time and robust tracking of the full 3D hand pose using conventional RGB input. ...
arXiv:1712.03866v1
fatcat:zwzeliwhkjgvlkv4taygbdxtom
HOI4D: A 4D Egocentric Dataset for Category-Level Human-Object Interaction
[article]
2022
arXiv
pre-print
HOI4D consists of 2.4M RGB-D egocentric video frames over 4000 sequences collected by 4 participants interacting with 800 different object instances from 16 categories over 610 different indoor rooms. ...
With HOI4D, we establish three benchmarking tasks to promote category-level HOI from 4D visual signals including semantic segmentation of 4D dynamic point cloud sequences, category-level object pose tracking ...
In terms of pose tracking, 6-PACK [42] tracks a small set of keypoints in RGB-D videos and estimates object pose by accumulating relative pose changes over time. ...
arXiv:2203.01577v3
fatcat:kkwisjhrkbgzfp764bt26hd2ra
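The HOI4D snippet above notes that 6-PACK [42] estimates object pose by accumulating relative pose changes over time. Below is a minimal sketch of that accumulation step only, assuming each per-frame estimate arrives as a 4x4 rigid transform; the keypoint tracking that produces those relative transforms is out of scope here.

```python
import numpy as np

def accumulate_poses(initial_pose, relative_poses):
    """Chain per-frame relative rigid transforms (4x4) onto an initial object pose.

    pose_t = pose_{t-1} @ delta_t, so any error in the deltas accumulates over
    time, which is why keypoint-based trackers typically filter or re-anchor.
    """
    poses = [initial_pose]
    for delta in relative_poses:
        poses.append(poses[-1] @ delta)
    return poses

# Illustrative example: the object rotates 5 degrees about Z each frame.
theta = np.deg2rad(5.0)
delta = np.eye(4)
delta[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]]
trajectory = accumulate_poses(np.eye(4), [delta] * 18)
print(np.round(trajectory[-1], 3))  # ~90 degrees of accumulated rotation
```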
Deep 6-DoF Tracking of Unknown Objects for Reactive Grasping
[article]
2021
arXiv
pre-print
Robotic manipulation of unknown objects is an important field of research. Practical applications occur in many real-world settings where robots need to interact with an unknown environment. ...
Our object tracking method combines Siamese Networks with an Iterative Closest Point approach for pointcloud registration into a method for 6-DoF unknown object tracking. ...
The method is suited as a real-time system for real-world robotic grasping of known objects. ...
arXiv:2103.05401v3
fatcat:g2nyvnmbzrdopocklfc4vavley
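The entry above combines Siamese networks with an Iterative Closest Point (ICP) step for point-cloud registration. Here is a minimal ICP sketch using Open3D (my library choice for illustration, not necessarily what the authors used), assuming the object points have already been segmented out of the scene.

```python
import numpy as np
import open3d as o3d  # assumption: Open3D for the ICP step; not confirmed by the paper

def icp_6dof(source_pts, target_pts, init=np.eye(4), max_dist=0.02):
    """Register two (N, 3) object point clouds and return a 4x4 rigid transform.
    In a tracker, `init` would be the pose estimated for the previous frame."""
    src = o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(source_pts)
    tgt = o3d.geometry.PointCloud()
    tgt.points = o3d.utility.Vector3dVector(target_pts)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation

# Illustrative usage (hypothetical variables): register consecutive frames and
# chain the result into the running 6-DoF estimate.
# delta = icp_6dof(prev_object_cloud, curr_object_cloud)
# pose = pose @ delta
```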
Vision-Based Intelligent Perceiving and Planning System of a 7-DoF Collaborative Robot
2021
Computational Intelligence and Neuroscience
In this paper, an intelligent perceiving and planning system based on deep learning is proposed for a collaborative robot consisting of a 7-DoF (7-degree-of-freedom) manipulator, a three-finger robot hand ...
A new trajectory planning method of the manipulator was proposed to improve efficiency. The performance of the IPPS was tested with simulations and experiments in a real environment. ...
When a human grasps an object, the motion is usually hard to track because of the many movable joints. Therefore, in this research, a new method to realize hand tracking was proposed to achieve finger joint ...
doi:10.1155/2021/5810371
pmid:34630547
pmcid:PMC8497130
fatcat:4kskk7glmresvonw7namb4pk7a
H2O: Two Hands Manipulating Objects for First Person Interaction Recognition
[article]
2021
arXiv
pre-print
We further propose the method to predict interaction classes by estimating the 3D pose of two hands and the 6D pose of the manipulated objects, jointly from RGB images. ...
Our dataset, called H2O (2 Hands and Objects), provides synchronized multi-view RGB-D images, interaction labels, object classes, ground-truth 3D poses for left & right hands, 6D object poses, ground-truth ...
While single hand manipulation is relevant for some scenarios, most of the time, hand-object interaction involves two hands manipulating an object. ...
arXiv:2104.11181v2
fatcat:6hjwuoctcfeytofsot5wsicpe4
A Survey on 3D Hand Skeleton and Pose Estimation by Convolutional Neural Network
2020
Advances in Science, Technology and Engineering Systems
In this paper, we surveyed studies in which Convolutional Neural Networks (CNNs) were used to estimate the 3D hand pose from data obtained from cameras (e.g., RGB camera, depth (D) camera, RGB-D camera ...
The surveyed studies were divided based on the type of input data and publication time. ...
Acknowledgment: This research was funded by an institution-level research project of Hung Vuong University, Vietnam, titled "Using the Lie algebra, Lie group to improve the skeleton hand presentation". ...
doi:10.25046/aj050418
fatcat:tzpjnmpwtjbh7m6ld3nucyvxia
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
[article]
2015
arXiv
pre-print
MS2 allows walking in physical space by tracking each user's skeleton in real-time and allows users to feel by employing passive haptics i.e., when users touch or manipulate an object in the virtual world ...
Additionally, users are only shown a representation of their hands in VR floating in front of the camera as seen from a first person perspective. ...
We use low-cost Kinect (RGB-D) devices to track multiple users and objects (positions and orientations) in real-time and use head-mounted displays (Oculus Rift DK2) for tracking head rotations and providing ...
arXiv:1512.02922v1
fatcat:7ybkvr7g2jgm7d2zd4n2mmjabe
Tracking objects with point clouds from vision and touch
2017
2017 IEEE International Conference on Robotics and Automation (ICRA)
We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. ...
of small objects by the robot's end effector. ...
Our tracking algorithm takes as input a continuous stream of RGB-D images from an off-the-shelf dense depth sensor, and depth images from the GelSight sensor. ...
doi:10.1109/icra.2017.7989460
dblp:conf/icra/IzattMAT17
fatcat:iy2cyakpuvcjnkfluthrvf6xdy
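The entry above fuses point clouds from an RGB-D camera and a GelSight contact sensor. Both streams start from a depth image, so here is a minimal depth-to-point-cloud back-projection sketch; the intrinsics below are illustrative values, not the paper's calibration, and the tactile-visual fusion itself is not shown.

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Back-project a depth image (H, W, metres) into an (N, 3) point cloud
    using pinhole intrinsics K: X = Z * K^{-1} [u, v, 1]^T."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    pixels = np.stack([u.reshape(-1), v.reshape(-1), np.ones(h * w)], axis=0)
    rays = np.linalg.inv(K) @ pixels          # (3, N) unit-depth rays
    points = (rays * z).T                     # scale each ray by its depth
    return points[z > 0]                      # drop invalid (zero-depth) pixels

# Hypothetical intrinsics for a 640x480 depth sensor (illustrative only).
K = np.array([[575.8, 0.0, 319.5],
              [0.0, 575.8, 239.5],
              [0.0, 0.0, 1.0]])
depth = np.full((480, 640), 0.6)              # a flat surface 0.6 m away
cloud = depth_to_point_cloud(depth, K)        # fused downstream with the tactile cloud
print(cloud.shape)                            # (307200, 3)
```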
Showing results 1 — 15 out of 3,475 results