A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2016; you can also visit the original URL.
The file type is application/pdf.
Guest Editorial Special Issue on Visual SLAM
2008
IEEE Transactions on Robotics
time and on increasingly large scales. ...
real-time operation. ...
doi:10.1109/tro.2008.2004620
fatcat:u6zxmc3awfbgnkgdghxoitkwni
Visual-Inertial Monocular SLAM With Map Reuse
2017
IEEE Robotics and Automation Letters
In this work we present a novel tightly-coupled Visual-Inertial Simultaneous Localization and Mapping system that is able to close loops and reuse its map to achieve zero-drift localization in already ...
We compare to the state-of-the-art in visual-inertial odometry in sequences with revisiting, proving the better accuracy of our method due to map reuse and no drift accumulation. ...
CONCLUSIONS We have presented in this paper a novel tightly-coupled Visual-Inertial SLAM system that is able to close loops in real-time and localize the sensor by reusing the map in already mapped areas ...
doi:10.1109/lra.2017.2653359
dblp:journals/ral/Mur-ArtalT17
fatcat:5tv4e6wrifhftluf6ugypbrsj4
Maplab: An Open Framework for Research in Visual-Inertial Mapping and Localization
2018
IEEE Robotics and Automation Letters
On the one hand, maplab can be seen as a ready-to-use visual-inertial mapping and localization system. ...
Furthermore, it includes an online frontend that can create visual-inertial maps and also track a global drift-free pose within a localization map. ...
It can build new maps from raw visual and inertial sensor data and additionally track a global (drift-free) pose in real-time if a localization map is provided. ...
doi:10.1109/lra.2018.2800113
fatcat:cmdn7kx2zvawxo6wzpxcv22gqy
ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM
[article]
2020
arXiv
pre-print
The result is a system that operates robustly in real-time, in small and large, indoor and outdoor environments, and is 2 to 5 times more accurate than previous approaches. ...
This paper presents ORB-SLAM3, the first system able to perform visual, visual-inertial and multi-map SLAM with monocular, stereo and RGB-D cameras, using pin-hole and fisheye lens models. ...
The goal of Visual SLAM is to use the sensors on-board a mobile agent to build a map of the environment and compute in real-time the pose of the agent in that map. ...
arXiv:2007.11898v1
fatcat:pvo4rwqobbdpbj2idqke6somxe
A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching
[article]
2022
arXiv
pre-print
As an alternative, we revisit the assumed density formulation of Bayesian filtering and employ a moment matching (unscented Kalman filtering) approach to both visual-inertial odometry and visual SLAM. ...
The fusion of camera sensor and inertial data is a leading method for ego-motion tracking in autonomous and smart devices. ...
Sec. IV-B considers replacing the time update in HybVIO [28], a state-of-the-art VIO model, with a UKF prediction step for improved robustness in real-world visual-inertial state estimation.
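The snippet above describes swapping an EKF-style time update for an unscented (sigma-point) prediction, i.e. matching the first two moments of the propagated state. A minimal sketch of such a moment-matching prediction step — function and parameter names are illustrative, not HybVIO's actual API:

```python
import numpy as np

def ukf_predict(mean, cov, f, Q, alpha=1e-3, beta=2.0, kappa=0.0):
    """One unscented-transform prediction step (moment matching).

    mean: (n,) state mean; cov: (n, n) state covariance;
    f: nonlinear motion model x -> x'; Q: process noise covariance.
    """
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean plus/minus columns of a scaled covariance square root.
    S = np.linalg.cholesky((n + lam) * cov)
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])  # (2n+1, n)
    # Standard unscented-transform weights for mean and covariance.
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    # Propagate each sigma point through the nonlinear motion model,
    # then recover the predicted mean and covariance from the propagated set.
    prop = np.array([f(s) for s in sigmas])
    mean_pred = wm @ prop
    diff = prop - mean_pred
    cov_pred = (wc[:, None] * diff).T @ diff + Q
    return mean_pred, cov_pred
```

Unlike an EKF time update, no Jacobian of `f` is needed; for a linear model the result coincides with the EKF prediction exactly.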
arXiv:2205.13821v1
fatcat:x6u2ngybb5cpnj25qfwfxdf7zm
Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual–IMU–Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments
2022
Agronomy
Autonomous navigation in greenhouses requires agricultural robots to localize and generate a globally consistent map of surroundings in real-time. ...
In this study, a state-of-the-art real-time localization and mapping system was presented to achieve precise pose estimation and dense three-dimensional (3D) point cloud mapping in complex greenhouses ...
The practical range of the depth camera can reach up to 16 m [32], so it is helpful to create a visual 3D map of large-scale greenhouse environments. ...
doi:10.3390/agronomy12081740
fatcat:2kaui4jbdfc3rbj3qezy3k42iq
Real-Time Large Scale 3D Reconstruction by Fusing Kinect and IMU Data
2015
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
However, generating dense 3D maps of large-scale environments is still very challenging. ...
Moreover, it is immune to tracking failures, and has smaller drift than the state-of-the-art systems in large scale reconstruction. ...
CONCLUSION We presented an extended KinectFusion system for real-time large scale 3D reconstruction by fusing Kinect and IMU data. ...
doi:10.5194/isprsannals-ii-3-w5-491-2015
fatcat:43i6m2ejffgpdkv5wivcthuwg4
Simultaneous localization and mapping for pedestrians using only foot-mounted inertial sensors
2009
Proceedings of the 11th International Conference on Ubiquitous Computing (UbiComp '09)
over time. ...
In this paper we describe a new Bayesian estimation approach for simultaneous mapping and localization for pedestrians based on odometry with foot mounted inertial sensors. ...
non-linear error growth in inertial integration over time. ...
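The entry above attributes pedestrian SLAM's difficulty to non-linear error growth when integrating inertial measurements over time. A toy illustration (all values made up): a constant accelerometer bias b, integrated twice, produces a position error of roughly 0.5·b·t², i.e. quadratic growth, which is what map-based corrections such as loop closures are meant to bound.

```python
dt = 0.01      # integration step [s]
bias = 0.05    # assumed constant accelerometer bias [m/s^2]

def drift_after(t_total):
    """Dead-reckon a stationary sensor whose accelerometer reads only bias."""
    v = p = 0.0
    for _ in range(int(round(t_total / dt))):
        v += bias * dt   # integrate acceleration -> velocity
        p += v * dt      # integrate velocity -> position
    return p

# Position error grows roughly 100x when the elapsed time grows 10x.
print(drift_after(10.0), drift_after(100.0))
```

Ten seconds of this bias yields metre-level error; a hundred seconds yields hundreds of metres, which is why inertial-only tracking degrades so quickly without external constraints.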
doi:10.1145/1620545.1620560
dblp:conf/huc/RobertsonAK09
fatcat:yqunhfs2izcebkwnjtto7t6g6i
A dual-layer estimator architecture for long-term localization
2008
2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
The performance of the developed system is demonstrated in large-scale experiments, involving a vehicle localizing within an urban area. ...
In this paper, we present a localization algorithm for estimating the 3D position and orientation (pose) of a moving vehicle based on visual and inertial measurements. ...
These properties render the proposed architecture suitable for large-scale localization applications. ...
doi:10.1109/cvprw.2008.4563131
dblp:conf/cvpr/MourikisR08
fatcat:j344reawg5gwhhe2itj75u6rly
Accurate Monocular Visual-inertial SLAM using a Map-assisted EKF Approach
2019
IEEE Access
This paper presents a novel tightly coupled monocular visual-inertial simultaneous localization and mapping (SLAM) algorithm, which provides accurate and robust motion tracking at high frame rates on a ...
In a parallel thread, we construct a global map and perform a keyframe-based visual-inertial bundle adjustment to optimize the map. ...
An overview of the proposed monocular visual-inertial SLAM algorithm is shown in Fig. 1. The system is complete and drift-free in large-scale environments. ...
doi:10.1109/access.2019.2904512
fatcat:coe7svjcf5bn5psv3iehfacz34
Accurate Monocular Visual-inertial SLAM using a Map-assisted EKF Approach
[article]
2018
arXiv
pre-print
in real time on a standard CPU. ...
This paper presents a novel tightly-coupled monocular visual-inertial Simultaneous Localization and Mapping algorithm, which provides accurate and robust localization within the globally consistent map ...
Our monocular visual-inertial SLAM algorithm is shown in Fig. 1. The system is complete and drift-free in large-scale environments. The remainder of the paper is organized as follows. ...
arXiv:1706.03648v3
fatcat:o5wiizclkvdkbdgjheaengkyy4
Stereo Visual Inertial LiDAR Simultaneous Localization and Mapping
[article]
2019
arXiv
pre-print
The system generates loop-closure-corrected 6-DOF LiDAR poses in real-time and 1 cm voxel dense maps in near real-time. ...
VIL-SLAM accomplishes this by incorporating tightly-coupled stereo visual inertial odometry (VIO) with LiDAR mapping and LiDAR enhanced visual loop closure. ...
STEREO VISUAL INERTIAL ODOMETRY The goal of the stereo VIO is to provide a real-time, accurate state estimate at a relatively high frequency, serving as the motion model for the LiDAR mapping algorithm. ...
arXiv:1902.10741v1
fatcat:hhbnql4da5gvjodzklqbk5mlzu
PIVO: Probabilistic Inertial-Visual Odometry for Occlusion-Robust Navigation
[article]
2018
arXiv
pre-print
This paper presents a novel method for visual-inertial odometry. ...
Stronger coupling between the inertial and visual data sources leads to robustness against occlusion and feature-poor environments. ...
Perhaps one of the most promising approaches for precise real-time tracking is visual-inertial odometry, which is based on fusing measurements from inertial sensors (i.e. an accelerometer and gyroscope ...
arXiv:1708.00894v2
fatcat:gh6u4mtvojg3plcs55xferev7e
PIVO: Probabilistic Inertial-Visual Odometry for Occlusion-Robust Navigation
2018
2018 IEEE Winter Conference on Applications of Computer Vision (WACV)
This paper presents a novel method for visual-inertial odometry. ...
Stronger coupling between the inertial and visual data sources leads to robustness against occlusion and feature-poor environments. ...
Perhaps one of the most promising approaches for precise real-time tracking is visual-inertial odometry, which is based on fusing measurements from inertial sensors (i.e. an accelerometer and gyroscope ...
doi:10.1109/wacv.2018.00073
dblp:conf/wacv/SolinCRK18
fatcat:fnj4arux4va5zfnblf63tz2d6y
Planes, trains and automobiles — autonomy for the modern robot
2010
2010 IEEE International Conference on Robotics and Automation
We are concerned with enabling truly large scale autonomous navigation in typical human environments. ...
Over 181 GB of image and inertial data are captured using head-mounted stereo cameras. This data is processed into a relative map covering 121 km of Southern England. ...
While it is certainly possible to build large scale, consistent global world models (especially with the use of GPS), we find that there are numerous real world situations where it is impossible to do ...
doi:10.1109/robot.2010.5509527
dblp:conf/icra/SibleyMRN10
fatcat:447gzuc73beldfzxsgpsjfwl3e
Showing results 1–15 out of 2,992 results