
A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks

Po-Chang Su, Ju Shen, Wanxin Xu, Sen-Ching Cheung, Ying Luo
2018 Sensors  
This paper is an extended version of our paper published in  ...  The extension covers extrinsic calibration for wide-baseline RGB-D camera networks.  ...  Second, instead of modeling camera extrinsic calibration as a rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions, including  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s18010235 pmid:29342968 pmcid:PMC5795566 fatcat:xz4zc2wjvjfczcocizmvdrfg7i
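
The "rigid transformation" that this abstract contrasts with other view-transformation functions is simply a rotation plus a translation between camera frames. A minimal NumPy sketch of applying such a transform to one camera's point cloud, with illustrative values rather than anything taken from the paper:

```python
import numpy as np

def make_rigid(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply T (4x4) to an (N, 3) point cloud."""
    homo = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return (homo @ T.T)[:, :3]

# Illustrative pose: 90-degree rotation about Z plus a 0.5 m translation along X.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.5, 0.0, 0.0])
cloud = np.random.rand(1000, 3)              # stand-in for one camera's points
cloud_in_reference = transform_points(make_rigid(R, t), cloud)
```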

The Potential of Light Fields in Media Productions

Jonas Trottnow, Martin Alain, Aljosa Smolic, Trevor Canham, Olivier Vu-Thanh, Javier Vázquez-Corral, Marcelo Bertalmío, Simon Spielmann, Tobias Lange, Kelvin Chelli, Marek Solony, Pavel Smrz (+3 others)
2019 SIGGRAPH Asia 2019 Technical Briefs on - SA '19  
This document describes challenges in building and shooting with the light field camera array, as well as its potential and challenges for post-production.  ...  A special light field camera was built by Saarland University [Herfet et al. 2018] and is first tested under production conditions in the test production "Unfolding" as part of the SAUCE project.  ...  Having a separate depth map for each camera is valuable for camera array light fields because the large camera separations mean they view different objects, both with respect to field of view and parallax  ... 
doi:10.1145/3355088.3365158 dblp:conf/siggrapha/TrottnowSLCSSZA19 fatcat:oq5yrmfvpbhotgvvl3smsxloxa

Targetless Calibration of a Lidar - Perspective Camera Pair

Levente Tamas, Zoltan Kato
2013 2013 IEEE International Conference on Computer Vision Workshops  
A novel method is proposed for the calibration of a camera-3D lidar pair without the use of any special calibration pattern or point correspondences.  ...  The calibration is solved as a 2D-3D registration problem using a minimum of one (for extrinsic) or two (for intrinsic-extrinsic) planar regions visible in both cameras.  ...  The most interesting calibration experiment was performed with an IR camera, which has a rather limited resolution and a narrow field of view.  ... 
doi:10.1109/iccvw.2013.92 dblp:conf/iccvw/TamasK13 fatcat:jvubwssiwrgsppsncdsyo2ppqi
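
The targetless 2D-3D registration described in this abstract can be pictured as scoring a candidate lidar-to-camera pose by how well the projected 3D points of a planar region cover the corresponding region segmented in the image. The sketch below is a hypothetical illustration of that idea; the projection helper, the overlap score, and all names are assumptions, not the authors' cost function.

```python
import numpy as np

def project(K, R, t, pts3d):
    """Pinhole projection of (N, 3) lidar points into pixel coordinates."""
    cam = pts3d @ R.T + t                  # lidar frame -> camera frame
    cam = cam[cam[:, 2] > 0]               # keep points in front of the camera
    uv = cam[:, :2] / cam[:, 2:3]          # perspective division
    return uv @ K[:2, :2].T + K[:2, 2]     # apply focal lengths and principal point

def overlap_score(uv, region_mask):
    """Fraction of projected points landing inside the segmented 2D region."""
    h, w = region_mask.shape
    px = np.round(uv).astype(int)
    inside = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
    px = px[inside]
    if len(px) == 0:
        return 0.0
    return float(region_mask[px[:, 1], px[:, 0]].mean())
```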

Calibration of a network of Kinect sensors for robotic inspection over a large workspace

R. Macknojia, A. Chavez-Aragon, P. Payeur, R. Laganiere
2013 2013 IEEE Workshop on Robot Vision (WORV)  
The internal calibration of the sensor between its color and depth measurements is also presented.  ...  This paper presents an approach for calibrating a network of Kinect devices used to guide robotic arms with rapidly acquired 3D models.  ...  The focal length of the IR camera is larger than that of the color camera, i.e., the color camera has a larger field of view.  ... 
doi:10.1109/worv.2013.6521936 fatcat:ab2v426eojcnzgbcplaxxokshy

Calibrate Multiple Consumer RGB-D Cameras for Low-Cost and Efficient 3D Indoor Mapping

Chi Chen, Bisheng Yang, Shuang Song, Mao Tian, Jianping Li, Wenxia Dai, Lina Fang
2018 Remote Sensing  
However, because of the narrow field of view (FOV), its collection efficiency and data coverage are lower than those of laser scanners.  ...  The calibration procedure is two-fold: (1) Intrinsic calibration involving the geometry/depth calibration of a single RGB-D camera; and (2) Extrinsic calibration solving  ...  Camera tracking-based methods do not require an optical tracking system or a sufficient overlap in field of view.  ... 
doi:10.3390/rs10020328 fatcat:ewcwwqzjujbfbchoevcpprhhem
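
The two-fold procedure mentioned in this abstract (intrinsic geometry/depth calibration of each RGB-D camera, then extrinsic alignment into a common frame) can be illustrated with a deliberately simplified linear depth-correction model. The model, names, and solver below are assumptions for illustration only, not the calibration used in the paper.

```python
import numpy as np

def fit_depth_correction(measured, reference):
    """Least-squares fit of d_true ~ a * d_measured + b from paired depth samples."""
    A = np.stack([measured, np.ones_like(measured)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, reference, rcond=None)
    return a, b

def to_common_frame(points, R, t, a, b):
    """Rescale each point along its viewing ray so its depth (Z) matches the
    corrected value, then map it into the common frame with a rigid transform."""
    points = points[points[:, 2] > 0]        # keep points with valid depth
    scale = (a * points[:, 2] + b) / points[:, 2]
    return (points * scale[:, None]) @ R.T + t
```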

Dataset and Pipeline for Multi-view Light-Field Video

Neus Sabater, Guillaume Boisson, Benoit Vandame, Paul Kerbiriou, Frederic Babon, Matthieu Hog, Remy Gendrot, Tristan Langlois, Olivier Bureller, Arno Schubert, Valerie Allie
2017 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)  
Our pipeline includes algorithms such as geometric calibration, color homogenization, view pseudo-rectification and depth estimation.  ...  Some such applications require a large parallax between the different views of the Light-Field, making multi-view capture a better option than plenoptic cameras.  ...  So we have an approach more similar to [17], in which a method for calibrating large camera arrays is presented.  ... 
doi:10.1109/cvprw.2017.221 dblp:conf/cvpr/SabaterBVKBHGLB17 fatcat:brgy3lftqrc3pkxse3zwhik4ka

Extrinsic Calibration of Multiple RGB-D Cameras From Line Observations

Alejandro Perez-Yus, Eduardo Fernandez-Moral, Gonzalo Lopez-Nicolas, Jose J. Guerrero, Patrick Rives
2018 IEEE Robotics and Automation Letters  
This paper presents a novel method to estimate the relative poses between RGB and depth cameras without the requirement of an overlapping field of view, thus providing flexibility to calibrate a variety  ...  This calibration problem is relevant to robotic applications, which can benefit from using several cameras to increase the field of view.  ...  EXTRINSIC CALIBRATION FROM LINE OBSERVATIONS In this section we address the problem of extrinsic calibration of a depth camera D and a color camera C.  ... 
doi:10.1109/lra.2017.2739104 dblp:journals/ral/Perez-YusFLGR18 fatcat:bixhdffdmnerln2idh4gx7l5oy
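
One standard way to turn line observations into extrinsic constraints is to require that a 3D line (point p, direction d) measured by the depth camera lie on the interpretation plane of its 2D observation l in the color camera: with n = K^T l, both n·(R d) = 0 and n·(R p + t) = 0 must hold. The sketch below uses that generic formulation; it is not necessarily the exact cost minimized in this paper, and all names are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(x, K, lines_2d, points_3d, dirs_3d):
    """x = [rotation vector (3), translation (3)] of the depth-to-color pose."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    res = []
    for l, p, d in zip(lines_2d, points_3d, dirs_3d):
        n = K.T @ l                    # normal of the interpretation plane
        n = n / np.linalg.norm(n)
        res.append(n @ (R @ d))        # line direction must lie in the plane
        res.append(n @ (R @ p + t))    # a point on the line must lie in the plane
    return np.array(res)

# Hypothetical usage, given detected line correspondences:
# sol = least_squares(residuals, np.zeros(6),
#                     args=(K, lines_2d, points_3d, dirs_3d))
# R_est = Rotation.from_rotvec(sol.x[:3]).as_matrix(); t_est = sol.x[3:]
```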

Surgical Structured Light for 3D minimally invasive surgical imaging

Austin Reiter, Alexandros Sigaras, Dennis Fowler, Peter K. Allen
2014 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems  
Surgeons perform minimally invasive surgery using an image delivered by a laparoscope and a camera system that provides a high definition 2D image, but this leaves the surgeon without 3D depth perception  ...  We demonstrate the accuracy of our SSL system using ex-vivo data on both a cylinder calibration object as well as various plastic organs.  ...  Next, we apply the rigid-body stereo extrinsics recovered from the camera calibration to represent this 3D point with respect to the imaging camera.  ... 
doi:10.1109/iros.2014.6942722 dblp:conf/iros/ReiterSFA14 fatcat:yha6ts47mzawtnaqmfcmuejfpi

Rapid 3D Modeling and Parts Recognition on Automotive Vehicles Using a Network of RGB-D Sensors for Robot Guidance

Alberto Chávez-Aragón, Rizwan Macknojia, Pierre Payeur, Robert Laganière
2013 Journal of Sensors  
The work introduces a method to integrate raw streams from depth sensors in the task of 3D profiling and reconstruction and a methodology for the extrinsic calibration of a network of Kinect sensors.  ...  This paper presents an approach for the automatic detection and fast 3D profiling of lateral body panels of vehicles.  ...  STPGP 381229-09, as well as the collaboration of Scintrex Trace Corp.  ... 
doi:10.1155/2013/832963 fatcat:abonn2tn5fdf3iv7av5htc7i34

Motorcycles that See: Multifocal Stereo Vision Sensor for Advanced Safety Systems in Tilting Vehicles

Gustavo Gil, Giovanni Savino, Simone Piantini, Marco Pierini
2018 Sensors  
Our aim was to study a camera-based sensor for the application of preventive safety in tilting vehicles.  ...  Our promising results support the application of this type of sensor for advanced motorcycle safety applications.  ... 
doi:10.3390/s18010295 pmid:29351267 pmcid:PMC5795592 fatcat:lwn47lgl2rflnlmxbailmlvhoa

A real-time coarse-to-fine multiview capture system for all-in-focus rendering on a light-field display

Fabio Marton, Enrico Gobbetti, Fabio Bettio, Jose Antonio Iglesias Guitian, Ruggero Pintus
2011 2011 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON)  
The capture component is an array of low-cost USB cameras connected to a single PC.  ...  For all-in-focus rendering, view-dependent depth is estimated on the GPU using a customized multiview space-sweeping approach based on fast Census-based area matching implemented in CUDA.  ...  A final CUDA kernel then samples the light field from this depth using a narrow aperture filter, which combines the colors of the 4 nearest cameras with a narrow Gaussian weight.  ... 
doi:10.1109/3dtv.2011.5877176 fatcat:puvqqueek5hmvd7tft7ugpe5qq
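
The narrow-aperture filter mentioned in the last snippet amounts to a Gaussian-weighted blend of the colors seen by the nearest cameras. The following is a CPU/NumPy stand-in for that CUDA kernel, with an assumed sigma and toy distances, not the system's actual code:

```python
import numpy as np

def narrow_aperture_sample(colors, distances, sigma=0.05):
    """colors: (4, 3) samples from the 4 nearest cameras;
    distances: (4,) distance of each camera ray from the target ray."""
    w = np.exp(-0.5 * (np.asarray(distances) / sigma) ** 2)   # narrow Gaussian weights
    w /= w.sum()
    return (np.asarray(colors, dtype=float) * w[:, None]).sum(axis=0)

# Example: the nearest camera dominates because the Gaussian is narrow.
print(narrow_aperture_sample([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]],
                             [0.01, 0.08, 0.12, 0.20]))
```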

A Lightweight Leddar Optical Fusion Scanning System (FSS) for Canopy Foliage Monitoring

Zhouxin Xi, Christopher Hopkinson, Stewart B. Rood, Celeste Barnes, Fang Xu, David Pearce, Emily Jones
2019 Sensors  
This includes an entire framework of calibration and fusion algorithms utilizing Leddar depth measurements and image parallax information.  ...  robotics to recover hemispherical, colored point clouds.  ...  Acknowledgments: Zhouxin Xi would like to thank Laura Chasmer, Derek Peddle, and Craig Coburn from the University of Lethbridge; and Richard Fournier from the Université de Sherbrooke for the many invaluable  ... 
doi:10.3390/s19183943 pmid:31547362 pmcid:PMC6767693 fatcat:klkywaoa7jh7fabiua7jw7dzyy

Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [article]

Jorge Beltrán, Carlos Guindel, Fernando García
2021 arXiv   pre-print
We present a method to calibrate the extrinsic parameters of any pair of sensors involving LiDARs, monocular or stereo cameras, of the same or different modalities.  ...  However, the effective use of information from different sources requires an accurate calibration between the sensors involved, which usually implies a tedious and burdensome process.  ...  As for the cameras, the narrow field of view exhibited by the XB3's cameras (43°) contrasts with the wide angle of the Basler.  ... 
arXiv:2101.04431v1 fatcat:nqhu5xsbx5c5xaxvkieoueakma

Single-Shot Intrinsic Calibration for Autonomous Driving Applications

Abraham Monrroy Cano, Jacob Lambert, Masato Edahiro, Shinpei Kato
2022 Sensors  
In this paper, we present a first-of-its-kind method to determine clear and repeatable guidelines for single-shot camera intrinsic calibration using multiple checkerboards.  ...  With these intervals defined, we generated thousands of multiple checkerboard poses and evaluated them using ground truth values, in order to obtain configurations that lead to accurate camera intrinsic  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s22052067 pmid:35271212 pmcid:PMC8915015 fatcat:anhbhquw5jcwbkmgdkui52qagu
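
Assuming the individual checkerboards in the single shot have already been detected (the detection step is omitted here, and this is not the authors' tool), each board contributes one object/image point set, so a standard OpenCV calibration call can still be constrained by a single image. The board geometry (9x6 inner corners, 25 mm squares) is an illustrative assumption:

```python
import cv2
import numpy as np

def board_object_points(cols, rows, square):
    """3D corner coordinates of a cols x rows checkerboard lying in its own plane."""
    objp = np.zeros((cols * rows, 3), np.float32)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square
    return objp

def calibrate_single_shot(corner_sets, image_size, cols=9, rows=6, square=0.025):
    """corner_sets: one (N, 1, 2) float32 array of detected corners per board."""
    obj_pts = [board_object_points(cols, rows, square)] * len(corner_sets)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, corner_sets, image_size, None, None)
    return rms, K, dist
```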

OpenPTrack: Open source multi-camera calibration and people tracking for RGB-D camera networks

Matteo Munaro, Filippo Basso, Emanuele Menegatti
2016 Robotics and Autonomous Systems  
OpenPTrack is an open source software for multi-camera calibration and people tracking in RGB-D camera networks.  ...  Here we detail how a cascade of algorithms working on depth point clouds and color, infrared and disparity images is used to perform people detection from different types of sensors and in any indoor light  ...  Key collaborators include the University of Padova and Electroland. Portions of the work have been supported by the National Science Foundation (IIS-1323767).  ... 
doi:10.1016/j.robot.2015.10.004 fatcat:xb3dgwn7offhhjbvoc63frarai
Showing results 1 — 15 out of 1,683 results