RGB-D Indoor Plane-based 3D-Modeling using Autonomous Robot
2014
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
In this research work, we describe a system for visual odometry and 3D modeling using information from an RGB-D sensor (camera). ...
The visual odometry method estimates the relative pose between consecutive RGB-D frames through feature extraction and matching techniques. ...
Fast odometry from vision (FOVIS) is used in Huang et al. (2011) as a visual odometry method based on fast feature extraction across RGB image frames. ...
doi:10.5194/isprsarchives-xl-1-301-2014
fatcat:r55gpqjdv5af3cgeeau5p2lqpu
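The frame-to-frame relative pose estimation these entries describe can be sketched in a minimal form: given feature matches back-projected to 3D with the depth channel, the rigid transform between two frames follows from the Kabsch/Procrustes least-squares solution. This is an illustrative sketch, not any of the cited papers' implementations; all names and the synthetic data are assumptions.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t with dst ~ R @ src + t
    from matched 3D point sets (Kabsch/Procrustes, least squares)."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: recover a known rotation about z and a translation.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))       # stand-in for back-projected features
theta = 0.1
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.2, -0.1, 0.05])
moved = pts @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(pts, moved)
```

In a real pipeline this closed-form step would sit inside a RANSAC loop to reject outlier matches before the final least-squares refinement.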
Visual Odometry and Mapping for Autonomous Flight Using an RGB-D Camera
[chapter]
2016
Springer Tracts in Advanced Robotics
In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. ...
RGB-D cameras provide both a color image and per-pixel depth estimates. ...
The RGB-D camera is mounted at the base of the vehicle, tilted slightly down.
Fig. 2: The input RGB-D data to the visual odometry algorithm alongside the detected feature matches. ...
doi:10.1007/978-3-319-29363-9_14
fatcat:rszxizbt2vh37oxjkdcz7hevbm
Visual Odometry and Mapping for Indoor Environments Using RGB-D Cameras
[chapter]
2015
Communications in Computer and Information Science
Aiming to compute 6DOF camera poses for robots in a fast and efficient way, a Visual Odometry system for RGB-D sensors is designed and proposed that allows real-time position estimation despite the fact ...
One major application that directly benefits from these sensors is Visual Odometry, a class of algorithms responsible for estimating the position and orientation of a moving agent at the same time that a ...
This work is supported by the Coordination for the Improvement of Higher Education Personnel (CAPES) and the Funding Agency for Studies and Projects (FINEP). ...
doi:10.1007/978-3-662-48134-9_2
fatcat:vicep3j22rakrkmyxffkhn5dfm
Experimental study of odometry estimation methods using RGB-D cameras
2014
2014 IEEE/RSJ International Conference on Intelligent Robots and Systems
In recent years, several RGB-D visual odometry methods which process data from the sensor in different ways have been proposed. ...
the RGB-D data and environment characteristics. ...
[17] integrates FOVIS and dense visual odometry with ICP for dense RGB-D mapping on a powerful CPU and GPU. ...
doi:10.1109/iros.2014.6942632
dblp:conf/iros/FangS14
fatcat:oftizz7aizaufph53dtn45mjze
Real-time onboard 6DoF localization of an indoor MAV in degraded visual environments using a RGB-D camera
2015
2015 IEEE International Conference on Robotics and Automation (ICRA)
This paper presents an onboard 6DoF pose estimation method for an indoor MAV in challenging GPS-denied degraded visual environments by using an RGB-D camera. ...
First, a fast and robust relative pose estimation (6DoF Odometry) method is proposed, which uses the range rate constraint equation and photometric error metric to get the frame-to-frame transform. ...
Zhang and G. Dubey for their help. ...
doi:10.1109/icra.2015.7139931
dblp:conf/icra/FangS15
fatcat:lyah63lclfesljk6rf6ujdo7za
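The photometric error metric mentioned in this entry can be illustrated with a toy direct-alignment step: brute-force the integer image translation that minimizes the sum of squared intensity differences. This is a deliberately simplified sketch (real direct methods warp with the full 6DoF pose and depth, and optimize with Gauss-Newton); all names and data here are assumptions.

```python
import numpy as np

def ssd(a, b):
    """Photometric (sum-of-squared-differences) error between two images."""
    return float(np.sum((a - b) ** 2))

def best_shift(img_ref, img_cur, radius=3):
    """Brute-force the integer translation of img_cur that minimizes the
    photometric error against img_ref -- a toy direct alignment."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # Undo a candidate shift (dx, dy) and measure the residual.
            err = ssd(img_ref, np.roll(img_cur, (-dy, -dx), axis=(0, 1)))
            if err < best_err:
                best, best_err = (dx, dy), err
    return best, best_err

rng = np.random.default_rng(1)
ref = rng.uniform(size=(32, 32))
cur = np.roll(ref, (2, 1), axis=(0, 1))   # content shifted by dy=2, dx=1
shift, err = best_shift(ref, cur)
```

With noiseless wrapped images the correct shift drives the residual to exactly zero; with real frames the minimum is merely the smallest residual, which is why such methods add robust weighting.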
Depth camera SLAM on a low-cost WiFi mapping robot
2012
2012 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)
It then uses RGB-D images for Simultaneous Localization and Mapping. ...
Our two-stage localization architecture first performs real-time obstacle-avoidance-based navigation and visual-based odometry correction for bearing angles. ...
ACKNOWLEDGMENT The authors wish to thank Harald Steck, Philip Whiting, Michael MacDonald, Detlef Hartmann, Heinz-Dieter Hettstedt and Hans-Peter Enderle for their very useful feedback on the data acquisition ...
doi:10.1109/tepra.2012.6215673
dblp:conf/tepra/MirowskiPH12
fatcat:sad2wrcvn5gwjo5jqov6dluvyy
Lightweight Visual Odometry for Autonomous Mobile Robots
2018
Sensors
This paper presents a low-overhead real-time ego-motion estimation (visual odometry) system based on either a stereo or RGB-D sensor. ...
maps common in full visual SLAM methods. ...
Our system supports RGB-D data as well, where features are detected from the RGB image and depth information is extracted from the depth image. ...
doi:10.3390/s18092837
pmid:30154311
fatcat:2bvhht2er5awdgx6cwtqomts5m
Fast localization and 3D mapping using an RGB-D sensor
2013
2013 16th International Conference on Advanced Robotics (ICAR)
A real-time approach combining a monocular visual odometry algorithm and range depth data is proposed in this paper. ...
Low-cost range sensors represent an interesting class of sensors which are increasingly used for localization and mapping purposes in robotics. The combination of depth data and visual information can be ...
A new scale factor estimation method using a combination of RGB monocular visual odometry and depth map has been proposed. ...
doi:10.1109/icar.2013.6766558
dblp:conf/icar/LoiannoLS13
fatcat:ephdc6mjcbeybaohuy4633crpu
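The scale factor estimation this entry refers to, combining up-to-scale monocular odometry with a metric depth map, can be sketched as a robust ratio estimate: the unknown monocular scale is recovered as the median ratio of sensor depths to monocular depths over valid pixels. A minimal illustration under assumed synthetic data, not the paper's method:

```python
import numpy as np

def estimate_scale(depth_sensor, depth_mono):
    """Recover the monocular scale factor as the median ratio between
    metric sensor depths and up-to-scale monocular depths; the median
    is robust to dropouts and outliers compared with a mean."""
    valid = (depth_sensor > 0) & (depth_mono > 0)
    return float(np.median(depth_sensor[valid] / depth_mono[valid]))

rng = np.random.default_rng(2)
true_scale = 2.5
d_mono = rng.uniform(1.0, 5.0, size=200)   # up-to-scale monocular depths
d_sensor = true_scale * d_mono             # metric depths from the sensor
d_sensor[:10] = 0.0                        # simulate depth dropouts
scale = estimate_scale(d_sensor, d_mono)
```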
Evaluation of the modern visual SLAM methods
2015
2015 Artificial Intelligence and Natural Language and Information Extraction, Social Media and Web Search FRUCT Conference (AINL-ISMW FRUCT)
RGB-D dataset from TUM
Notes: OpenRatSLAM was provided with data at a 15 Hz rate instead of 30 Hz to make tracking robust. ...
Conclusion: Several modern SLAM methods (L-, LSD-, ORB- and OpenRat-SLAM) were analyzed and evaluated with the RGB-D dataset provided by TUM. ...
doi:10.1109/ainl-ismw-fruct.2015.7382963
fatcat:f4vqsz6gevajtipuxe7ontmapu
Direct Depth SLAM: Sparse Geometric Feature Enhanced Direct Depth SLAM System for Low-Texture Environments
2018
Sensors
This paper presents a real-time, robust and low-drift depth-only SLAM (simultaneous localization and mapping) method for depth cameras by utilizing both dense range flow and sparse geometry features from ...
We evaluate the performance of our method using benchmark datasets and real scene data. ...
Author Contributions: S.Z. contributed to the programing, experiments and writing of the manuscript. ...
doi:10.3390/s18103339
fatcat:sirtzyw2l5bjnfllni2fzxrzfq
Using Dense 3D Reconstruction for Visual Odometry Based on Structure from Motion Techniques
[chapter]
2016
Lecture Notes in Computer Science
Odometry provided by this work can be used to model a camera position and orientation from dense 3D reconstruction. ...
Visual odometry is the process of estimating the position and orientation of an agent (a robot, for instance), based on images. ...
• The second one uses the extracted data from the 3D map to perform visual odometry in real time. ...
doi:10.1007/978-3-319-50832-0_47
fatcat:wczy5cpvkzbcrirnvi53o3agqe
Point Cloud Mapping Measurements Using Kinect RGB-D Sensor and Kinect Fusion for Visual Odometry
2016
Procedia Computer Science
RGB-D cameras like the Kinect make RGB images available along with per-pixel depth information in real time. ...
This paper uses the Kinect Fusion developed by Microsoft Research for the 3D reconstruction of the scene in real time using the MicroKinect Camera and applies it as an aid for Visual Odometry of a Robotic ...
Taking advantage of this, the present paper proposes to use an RGB-D camera, which is readily available, for visual odometry applications. ...
doi:10.1016/j.procs.2016.06.044
fatcat:kiir2zpq7zfolbcopqgsgyka54
RGB-D SLAM Combining Visual Odometry and Extended Information Filter
2015
Sensors
Experimental validation is provided, which compares the proposed RGB-D SLAM algorithm with just RGB-D visual odometry and a graph-based RGB-D SLAM algorithm using the publicly-available RGB-D dataset. ...
In this paper, we present a novel RGB-D SLAM system based on visual odometry and an extended information filter, which does not require any other sensors or odometry. ...
The work in [31] uses RGB-D data to provide a complete benchmark for evaluating visual SLAM and odometry systems and proposes two evaluation metrics and automatic evaluation tools. Kerl et al. ...
doi:10.3390/s150818742
pmid:26263990
pmcid:PMC4570344
fatcat:7l5k5ibyezd63fny6kjrzan4wy
Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints
2021
Sensors
If a depth image is available, a window-based feature map is maintained to compute the RGB-D odometry, which is then fused with inertial outputs in an extended Kalman filter framework. ...
In depth drop conditions, only the partial 5-degrees-of-freedom pose information (attitude and position with an unknown scale) is available from the RGB-D sensor. ...
RGB-D images are processed in a local Kinect odometry module that utilizes a window-based map for real-time processing. 2D RGB images are used for directional motion constraints and rotation rate and fused ...
doi:10.3390/s21175913
pmid:34502806
fatcat:chlnewub5nhixc5p4tejak25ya
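The fusion scheme this entry describes, RGB-D odometry corrected within a Kalman filter framework driven by inertial outputs, can be illustrated with a scalar predict/update cycle. This is a one-dimensional teaching sketch under assumed noise values, not the paper's extended Kalman filter:

```python
def kf_step(x, P, u, z, q=0.01, r=0.04):
    """One predict/update cycle of a scalar Kalman filter: the inertial
    increment u drives the prediction, the odometry position z corrects it.
    q is the process-noise variance, r the measurement-noise variance."""
    x_pred = x + u                    # predict with inertial motion increment
    P_pred = P + q                    # process noise inflates uncertainty
    K = P_pred / (P_pred + r)         # Kalman gain
    x_new = x_pred + K * (z - x_pred) # correct with the odometry residual
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
# Noiseless toy data: constant velocity 0.1 per step, perfect odometry fixes.
for k in range(1, 11):
    x, P = kf_step(x, P, u=0.1, z=0.1 * k)
```

When a depth drop leaves only partial pose information, a real filter would simply skip (or down-weight) the update step and coast on the prediction, which is exactly what the growing covariance term models.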
Efficient generation of 3D surfel maps using RGB-D sensors
2016
International Journal of Applied Mathematics and Computer Science
The article focuses on the problem of building dense 3D occupancy maps using commercial RGB-D sensors and the SLAM approach. ...
The proposed solution consists of two such key elements, visual odometry and surfel-based mapping, but it contains substantial improvements: storing the surfel maps in octree form and utilizing a frustum ...
The initial version of this work was presented at the special session on Robotic perception employing RGB-D images during the 13th National Conference on Robotics in Kudowa Zdrój, Poland, 2014. ...
doi:10.1515/amcs-2016-0007
fatcat:kernkxup55bchoeff2wtti5bea
Showing results 1–15 out of 2,065 results