
Single-Shot is Enough: Panoramic Infrastructure Based Calibration of Multiple Cameras and 3D LiDARs [article]

Chuan Fang, Shuai Ding, Zilong Dong, Honghua Li, Siyu Zhu, Ping Tan
2021 arXiv   pre-print
In this paper, we propose a single-shot solution for calibrating extrinsic transformations among multiple cameras and 3D LiDARs.  ...  The calibration of multi-modal sensors is crucial for a system to function properly, but it remains tedious and impractical for mass production.  ...  The results show that the proposed method, using only a single frame, achieves performance comparable to Kalibr. 2) camera-LiDAR extrinsic calibration: The extrinsic calibration between a 3D LiDAR and  ... 
arXiv:2103.12941v2 fatcat:fu74gukkwvb2pizdlj3yoonuo4
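The entry above registers every camera and LiDAR against a shared panoramic infrastructure map and then derives pairwise extrinsics from those registrations. As a minimal, hypothetical sketch of that bookkeeping step only (not the paper's registration method), the fragment below chains 4x4 homogeneous transforms through a common reference frame; all poses are placeholder values.

```python
import numpy as np

def make_transform(rotation, translation):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def relative_extrinsic(T_ref_a, T_ref_b):
    """Pose of sensor B in sensor A's frame, given both poses in a common reference frame."""
    return np.linalg.inv(T_ref_a) @ T_ref_b

# Hypothetical single-shot registrations of each sensor against the infrastructure map.
T_map_cam = make_transform(np.eye(3), np.array([0.0, 0.0, 1.5]))
T_map_lidar = make_transform(np.eye(3), np.array([0.2, 0.0, 1.8]))

# Camera-to-LiDAR extrinsic obtained by chaining through the map frame.
T_cam_lidar = relative_extrinsic(T_map_cam, T_map_lidar)
print(T_cam_lidar)
```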

Road is Enough! Extrinsic Calibration of Non-overlapping Stereo Camera and LiDAR using Road Information [article]

Jinyong Jeong, Lucas Y. Cho, Ayoung Kim
2019 arXiv   pre-print
This paper presents a framework for the targetless extrinsic calibration of stereo cameras and Light Detection and Ranging (LiDAR) sensors with a non-overlapping Field of View (FOV).  ...  In order to solve the extrinsic calibration problem under such a challenging configuration, the proposed solution exploits road markings as static and robust features among the various dynamic objects that  ...  Fig. 1: Result of extrinsic calibration between a stereo camera and LiDARs: (a) point cloud data from LiDAR; (b) point cloud projected onto a stereo image using the calibration result.  ... 
arXiv:1902.10586v2 fatcat:lw2yxiszrneqli7mnjq3rspsyq
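Fig. 1(b) of the entry above shows the LiDAR point cloud projected onto a stereo image with the estimated calibration. The sketch below is a generic pinhole projection of LiDAR points given an extrinsic transform and a camera intrinsic matrix; the intrinsics, extrinsic, and points are placeholder values, not results from the paper.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into pixel coordinates of a pinhole camera.

    points_lidar : (N, 3) points in the LiDAR frame.
    T_cam_lidar  : (4, 4) extrinsic transform taking LiDAR points into the camera frame.
    K            : (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates of the points that lie in front of the camera.
    """
    homog = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ homog.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0.0          # keep only points with positive depth
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]           # perspective division

# Placeholder intrinsics, extrinsic, and synthetic points for illustration only.
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0,   0.0,   1.0]])
T = np.eye(4)
T[:3, 3] = [0.1, -0.05, -0.2]
points = np.random.rand(100, 3) * [4.0, 2.0, 8.0] + [-2.0, -1.0, 2.0]
pixels = project_lidar_to_image(points, T, K)
```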

Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review

De Jong Yeong, Gustavo Velasco-Hernandez, John Barry, Joseph Walsh
2021 Sensors  
This paper evaluates the capabilities and the technical performance of sensors that are commonly employed in autonomous vehicles, primarily focusing on a large selection of vision cameras, LiDAR sensors  ...  Sensor calibration is the foundation block of any autonomous system and its constituent sensors and must be performed correctly before sensor fusion and obstacle detection processes may be implemented.  ...  Acknowledgments: This paper and the research behind it would not have been possible without the support of the IMaR team at Munster Technological University.  ... 
doi:10.3390/s21062140 pmid:33803889 pmcid:PMC8003231 fatcat:j52leqrvwnhu5brd7lxgozvwya

Extrinsic Calibration of a 3D-LIDAR and a Camera [article]

Subodh Mishra, Gaurav Pandey, Srikanth Saripalli
2020 arXiv   pre-print
This work presents an extrinsic parameter estimation algorithm between a 3D LIDAR and a Projective Camera using a marker-less planar target, by exploiting Planar Surface Point to Plane and Planar Edge  ...  The steps include detection of the target and the edges of the target in LIDAR and Camera frames, matching the detected planes and lines across both sensing modalities, and finally solving a cost function  ...  [5] calibrate a 2D-LIDAR Camera system using a V-Shaped target and determine the extrinsic parameters by minimizing the distance between the 3D features projected on the image plane and the corresponding  ... 
arXiv:2003.01213v2 fatcat:mrivhoo355dgjaeg4opo2rhwx4
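The entry above matches a planar target across the LIDAR and camera and solves a cost built from point-to-plane and edge correspondences. The fragment below sketches only a point-to-plane term, optimised over a 6-DoF pose with SciPy; the synthetic plane, sample points, and axis-angle parameterisation are illustrative choices, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def point_to_plane_cost(x, lidar_pts, plane_normal_cam, plane_d_cam):
    """Sum of squared distances from transformed LiDAR points to the target plane
    expressed in the camera frame. x = [rx, ry, rz, tx, ty, tz] (axis-angle + translation)."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    pts_cam = lidar_pts @ R.T + t
    dist = pts_cam @ plane_normal_cam + plane_d_cam
    return np.sum(dist ** 2)

# Synthetic example: the target plane is z = 2 in the camera frame; the LiDAR is
# offset from the camera by a small known pose that the optimiser should compensate.
normal, d = np.array([0.0, 0.0, 1.0]), -2.0
true_R = Rotation.from_rotvec([0.0, 0.05, 0.0]).as_matrix()
true_t = np.array([0.1, -0.2, 0.3])
pts_cam = np.column_stack([np.random.uniform(-1, 1, (50, 2)), np.full(50, 2.0)])
lidar_pts = (pts_cam - true_t) @ true_R      # move the plane samples into the LiDAR frame

# Note: a single plane constrains only part of the 6-DoF pose, which is why the
# paper also uses edge correspondences; here we only check that the cost vanishes.
res = minimize(point_to_plane_cost, np.zeros(6), args=(lidar_pts, normal, d), method="BFGS")
print("final point-to-plane cost:", res.fun)
```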

Experimental Evaluation of 3D-LIDAR Camera Extrinsic Calibration [article]

Subodh Mishra, Philip R. Osteen, Gaurav Pandey, Srikanth Saripalli
2020 arXiv   pre-print
Error (MLRE) and Factory Stereo Calibration Error.  ...  In this paper we perform an experimental comparison of three different target-based 3D-LIDAR camera calibration algorithms.  ...  This has motivated research on estimating the extrinsic calibration between various sensors, such as 3D-LIDARs and cameras.  ... 
arXiv:2007.01959v1 fatcat:qkdya3ltvrhrhfsmfubckxpime

The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception

Alex Zihao Zhu, Dinesh Thakur, Tolga Ozaslan, Bernd Pfrommer, Vijay Kumar, Kostas Daniilidis
2018 IEEE Robotics and Automation Letters  
In addition, we utilize a combination of an IMU, a rigidly mounted lidar system, indoor and outdoor motion capture, and GPS to provide accurate pose and depth images for each camera at up to 100 Hz.  ...  There has been considerable recent interest in algorithms that use the events to perform a variety of 3D perception tasks, such as feature tracking, visual odometry, and stereo depth  ...  The camera intrinsics, stereo extrinsics, and camera-IMU extrinsics are calibrated using the Kalibr toolbox [30], [31], [32]; the extrinsics between the left DAVIS camera and Velodyne lidar are  ... 
doi:10.1109/lra.2018.2800793 dblp:journals/ral/ZhuTOPKD18 fatcat:ygjwj66avngdxjlk7ljawa6ntm

Project AutoVision: Localization and 3D Scene Perception for an Autonomous Vehicle with a Multi-Camera System

Lionel Heng, Benjamin Choi, Zhaopeng Cui, Marcel Geppert, Sixing Hu, Benson Kuan, Peidong Liu, Rang Nguyen, Ye Chuan Yeo, Andreas Geiger, Gim Hee Lee, Marc Pollefeys (+1 others)
2019 2019 International Conference on Robotics and Automation (ICRA)  
Project AutoVision aims to develop localization and 3D scene perception capabilities for a self-driving vehicle.  ...  The sensor suite employs many cameras for both 360-degree coverage and accurate multi-view stereo; the use of low-cost cameras keeps the cost of this sensor suite to a minimum.  ...  and a GNSS/INS system, and extrinsic calibration between a LiDAR sensor and a calibrated multi-camera system.  ... 
doi:10.1109/icra.2019.8793949 dblp:conf/icra/HengCCGHKLNYGLP19 fatcat:d7wcoms3drawhkd7klpu2sivlq

Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [article]

Jorge Beltrán, Carlos Guindel, Fernando García
2021 arXiv   pre-print
We present a method to calibrate the extrinsic parameters of any pair of sensors involving LiDARs, monocular or stereo cameras, of the same or different modalities.  ...  However, the effective use of information from different sources requires an accurate calibration between the sensors involved, which usually implies a tedious and burdensome process.  ...  In this step, we performed two different calibration procedures: monocular/LiDAR, involving one of the cameras of the stereo system and one of the LiDAR scanners, and LiDAR/LiDAR, between the two VLP-16  ... 
arXiv:2101.04431v1 fatcat:nqhu5xsbx5c5xaxvkieoueakma

DATA FUSION OF LIDAR INTO A REGION GROWING STEREO ALGORITHM

J. Veitch-Michaelis, J.-P. Muller, J. Storey, D. Walton, M. Foster
2015 The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences  
A low-level data fusion method is shown, involving a scanning LIDAR system and a stereo camera pair.  ...  This method also enables simple calibration of stereo cameras without the need for targets and trivial coregistration between the stereo and LIDAR point clouds.  ...  Assuming a good stereo calibration, the LIDAR intrinsic and extrinsic parameters with respect to the stereo system may be determined using least squares minimisation.  ... 
doi:10.5194/isprsarchives-xl-4-w5-107-2015 fatcat:oxy5ajppezcsln7npxe7l5hf5a
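The entry above notes that, given a good stereo calibration, the LIDAR extrinsics with respect to the stereo system can be recovered by least-squares minimisation. A standard closed-form solution when point correspondences are available is the SVD-based rigid alignment sketched below (a generic Kabsch-style estimate under that assumption, not necessarily the paper's exact procedure); the correspondences are synthetic.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rotation R and translation t with dst ≈ src @ R.T + t (Kabsch/SVD)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Synthetic correspondences: LiDAR points vs. the same points triangulated by the stereo pair.
rng = np.random.default_rng(0)
lidar_pts = rng.uniform(-5, 5, (200, 3))
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
true_t = np.array([0.3, -0.1, 1.2])
stereo_pts = lidar_pts @ true_R.T + true_t

R, t = rigid_align(lidar_pts, stereo_pts)
print(np.allclose(R, true_R), np.allclose(t, true_t))
```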

LiDAR-Camera Calibration Using Line Correspondences

Zixuan Bai, Guang Jiang, Ailing Xu
2020 Sensors  
In this paper, we introduce a novel approach to estimate the extrinsic parameters between a LiDAR and a camera.  ...  Our method is based on line correspondences between the LiDAR point clouds and camera images. We solve for the rotation matrix using 3D–2D infinity point pairs extracted from parallel lines.  ...  When using a stereo camera system, we can calibrate two pairs of extrinsic parameters between the LiDAR and the two cameras separately, and then we can estimate the extrinsic parameters between the two cameras  ... 
doi:10.3390/s20216319 pmid:33167580 pmcid:PMC7664239 fatcat:eywmwqqy4zcddc4fkrj5yhc5ru
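The entry above recovers the rotation from 3D–2D infinity point pairs: the vanishing point of a family of parallel LiDAR lines satisfies v ~ K R d, so each pair yields a direction correspondence between K⁻¹v and R d. The sketch below solves the resulting direction-alignment (Wahba) problem with an SVD; the intrinsics, rotation, and line directions are synthetic, and the sign ambiguity of back-projected vanishing points is glossed over.

```python
import numpy as np

def rotation_from_direction_pairs(dirs_lidar, dirs_cam):
    """Wahba problem: rotation R minimising sum_i ||dirs_cam_i - R @ dirs_lidar_i||^2 (SVD)."""
    H = dirs_lidar.T @ dirs_cam
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

# Synthetic intrinsics and ground-truth rotation (about the y axis) for illustration.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])
theta = np.deg2rad(10.0)
true_R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0,            1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])

# Unit directions of three families of parallel 3D lines in the LiDAR frame.
dirs_lidar = np.array([[1.0, 0.0, 0.2], [0.0, 1.0, 0.3], [0.3, -0.2, 1.0]])
dirs_lidar /= np.linalg.norm(dirs_lidar, axis=1, keepdims=True)

# Vanishing points of those lines in the image satisfy v ~ K @ R @ d.
vps = (K @ true_R @ dirs_lidar.T).T
# Back-project each vanishing point to a unit direction in the camera frame
# (all synthetic directions have positive depth, so the sign works out).
dirs_cam = (np.linalg.inv(K) @ vps.T).T
dirs_cam /= np.linalg.norm(dirs_cam, axis=1, keepdims=True)

R_est = rotation_from_direction_pairs(dirs_lidar, dirs_cam)
print(np.allclose(R_est, true_R))
```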

An Extrinsic Calibration Tool for Radar, Camera and Lidar

Joris Domhof, Julian F. P. Kooij, Dariu M. Gavrila
2019 2019 International Conference on Robotics and Automation (ICRA)  
We present a novel open-source tool for extrinsic calibration of radar, camera and lidar.  ...  Our results show that all configurations achieve good results for lidar to camera errors and that fully connected pose estimation shows the best performance for lidar to radar errors when more than five  ...  ACKNOWLEDGEMENT The work is supported by NWO TTW under the project STW#13434 Standardized Self-Diagnostic Sensing Systems for Highly Automated Driving.  ... 
doi:10.1109/icra.2019.8794186 dblp:conf/icra/DomhofKK19 fatcat:i22lpuepw5fu5be3xnab6lu7uu

Circular Targets for 3D Alignment of Video and Lidar Sensors

Vincent Fremont, Sergio A. Rodriguez F., Philippe Bonnifait
2012 Advanced Robotics  
This paper presents a novel approach for solving the problem of 3D alignment between video and lidar sensors.  ...  Circular calibration targets are used in order to make full use of the perception properties of both lidar and video cameras, which greatly simplifies the calibration task.  ...  We wish to emphasize that in the case of stereo video cameras, our circular target means that the 3D alignment between the lidar and the two cameras and the extrinsic calibration of the stereo rig can  ... 
doi:10.1080/01691864.2012.703235 fatcat:ewqoqh4abfbwxmssz7umz2naem

Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups [article]

Carlos Guindel, Jorge Beltrán, David Martín, Fernando García
2017 arXiv   pre-print
We present a method for extrinsic calibration of lidar-stereo camera pairs without user intervention.  ...  Sensor setups consisting of a combination of 3D range scanner lasers and stereo vision systems are becoming a popular choice for on-board perception systems in vehicles; however, the combined use of both  ...  ACKNOWLEDGEMENT Research supported by the Spanish Government through the CICYT projects (TRA2015-63708-R and TRA2016-78886-C3-1-R), and the Comunidad de Madrid through SEGVAUTO-TRIES (S2013/MIT-2713).  ... 
arXiv:1705.04085v3 fatcat:a6lcdq4pvra2hiyhy5un4cmm7q

Cross-calibration of push-broom 2D LIDARs and cameras in natural scenes

Ashley Napier, Peter Corke, Paul Newman
2013 2013 IEEE International Conference on Robotics and Automation  
A method of target-less extrinsic calibration, which required the user to specify several point correspondences between 3D range data and a camera image, was presented in [8].  ...  The laser reflectance image then becomes a function only of ^bT_l, which is equivalent to ^cT_l, the extrinsic calibration between the LIDAR and camera.  ... 
doi:10.1109/icra.2013.6631094 dblp:conf/icra/NapierCN13 fatcat:yesfde2p25g2lguv6paxx5mgbm
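The entry above makes the extrinsic calibration observable through the laser reflectance image: the calibration is found by making the reflectance image consistent with the camera image. As a loose, hypothetical analogue of such an appearance-based objective (the paper's actual metric and optimiser may differ), the sketch below scores a candidate extrinsic by the normalised cross-correlation between LiDAR reflectance values and camera intensities at the projected pixels, then grid-searches one translation component; all data are random placeholders.

```python
import numpy as np

def reflectance_alignment_score(points, reflectance, image, T_cam_lidar, K):
    """Normalised cross-correlation between LiDAR reflectance and camera intensity
    sampled at the pixels the LiDAR points project to (higher = better aligned)."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    pts_cam = (T_cam_lidar @ homog.T).T[:, :3]
    valid = pts_cam[:, 2] > 0.1                     # keep points in front of the camera
    uv = (K @ pts_cam[valid].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
    h, w = image.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    if inside.sum() < 10:
        return -np.inf
    cam_vals = image[uv[inside, 1], uv[inside, 0]].astype(float)
    ref_vals = reflectance[valid][inside].astype(float)
    cam_vals -= cam_vals.mean()
    ref_vals -= ref_vals.mean()
    denom = np.linalg.norm(cam_vals) * np.linalg.norm(ref_vals)
    return float(cam_vals @ ref_vals / denom) if denom > 0 else -np.inf

# Random placeholder data; in practice these come from the LIDAR scan and camera image.
rng = np.random.default_rng(1)
points = rng.uniform([-2.0, -2.0, 2.0], [2.0, 2.0, 8.0], (500, 3))
reflectance = rng.uniform(0.0, 255.0, 500)
image = rng.integers(0, 255, (480, 640))
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])

def with_tx(tx):
    """Candidate extrinsic that varies only the x translation."""
    T = np.eye(4)
    T[0, 3] = tx
    return T

scores = {tx: reflectance_alignment_score(points, reflectance, image, with_tx(tx), K)
          for tx in np.linspace(-0.5, 0.5, 11)}
best_tx = max(scores, key=scores.get)
print("best candidate x translation:", best_tx)
```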

Accurate Fruit Localisation for Robotic Harvesting using High Resolution LiDAR-Camera Fusion [article]

Hanwen Kang, Xing Wang, Chao Chen
2022 arXiv   pre-print
Two SOTA extrinsic calibration methods, target-based and targetless, are applied and evaluated to obtain an accurate extrinsic matrix between the LiDAR and camera.  ...  With the extrinsic calibration, the point clouds and color images are fused to perform fruit localisation using a one-stage instance segmentation network.  ...  A recent target-based method, ASAC [15], is used to calibrate the extrinsic matrix between the LiDAR and camera ... to maximise the similarity between LiDAR points ^L P and a template checkerboard model  ... 
arXiv:2205.00404v1 fatcat:ffsboiil3reafklhjbjt3g2lxm
Showing results 1 — 15 out of 997 results