
3D Depth Cameras in Vision: Benefits and Limitations of the Hardware [chapter]

Achuta Kadambi, Ayush Bhandari, Ramesh Raskar
2014 Advances in Computer Vision and Pattern Recognition  
The second-generation Microsoft Kinect uses time-of-flight technology, while the first-generation Kinect uses structured light technology.  ...  In this chapter, readers will find an overview of 3D camera technology and the artifacts that occur in depth maps.  ...  Comparison of First- and Second-Generation Kinect Models: The first-generation Kinect is a structured light vision system, while the second-generation Kinect is a time-of-flight system.  ...
doi:10.1007/978-3-319-08651-4_1 fatcat:gtuf2sgevzbzxf2biyzrk2zhti

Programmable Triangulation Light Curtains [chapter]

Jian Wang, Joseph Bartels, William Whittaker, Aswin C. Sankaranarayanan, Srinivasa G. Narasimhan
2018 Lecture Notes in Computer Science  
A vehicle on a road or a robot in the field does not need a full-featured 3D depth sensor to detect potential collisions or monitor its blind spot.  ...  We showcase the potential of light curtains using a range of real-world scenarios.  ...  A. C. Sankaranarayanan was supported in part by the NSF CAREER grant CCF-1652569. J. Bartels was supported by NASA fellowship NNX14AM53H.  ... 
doi:10.1007/978-3-030-01219-9_2 fatcat:7vaqdomvtzemlallwr2kqzkhfu

3D time-of-flight cameras for mobile robotics

Stefan May, Bjorn Werner, Hartmut Surmann, Kai Pervolz
2006 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems  
Recently developed 3D time-of-flight cameras have an enormous potential for mobile robotic applications, in particular for mapping and navigation tasks.  ...  This paper presents a new approach for online adaptation of different camera parameters to environment dynamics.  ...  Today the most common techniques for 3D sensing are based on CCD or CMOS cameras, laser scanners, or, more recently, 3D time-of-flight cameras.  ...
doi:10.1109/iros.2006.281670 dblp:conf/iros/MayWSP06 fatcat:nzzvilplingippom3hngcnp4w4

Computational 3D and reflectivity imaging with high photon efficiency

Dongeek Shin, Ahmed Kirmani, Vivek K Goyal, Jeffrey H. Shapiro
2014 2014 IEEE International Conference on Image Processing (ICIP)  
Capturing depth and reflectivity images at low light levels from active illumination of a scene has wide-ranging applications.  ...  Our computational imager combines physically accurate single-photon counting statistics with exploitation of the spatial correlations present in real-world reflectivity and 3D structure.  ...  Prior work: Active 3D imaging systems differ in how they modulate their transmitted power. Temporal modulation enables distance measurement by the time-of-flight (TOF) principle.  ... 
doi:10.1109/icip.2014.7025008 dblp:conf/icip/ShinKGS14 fatcat:gpqmtsqnlfgzzm5ynqrncxo5dy
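
As a rough illustration of the time-of-flight idea this snippet mentions (pairing per-pixel photon-counting statistics with spatial correlations), here is a minimal sketch in Python. It is not the authors' algorithm: it simply takes the median photon arrival time at each pixel, converts it to depth, and smooths the result with a spatial median filter. The array layout and function name are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

C = 3e8  # speed of light, m/s

def depth_from_photon_times(arrival_times_ns, neighborhood=3):
    """Toy photon-efficient depth estimate (illustrative only).

    arrival_times_ns: H x W x K array holding K photon time stamps (ns) per pixel.
    The per-pixel statistic (median arrival time) stands in for the photon-count
    model, and the spatial median filter stands in for exploiting spatial
    correlations in real-world 3D structure.
    """
    per_pixel_time_s = np.median(arrival_times_ns, axis=-1) * 1e-9
    per_pixel_depth_m = 0.5 * C * per_pixel_time_s   # halve the round-trip distance
    return median_filter(per_pixel_depth_m, size=neighborhood)
```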

Imaging atmospheric aerosol particles from a UAV with digital holography

Osku Kemppinen, Jesse C. Laning, Ryan D. Mersmann, Gorden Videen, Matthew J. Berg
2020 Scientific Reports  
The construction of HAPI consists of 3D printed polymer structures that enable a sufficiently low size and weight that it may be flown on a commercial-grade UAV.  ...  Using digital holography, the instrument obtains the images in a non-contact manner, resolving particles larger than ten micrometers in size in a sensing volume of approximately three cubic centimeters  ...  (a) Model showing the major components of the instrument, which include the electronics, sensing, and optics compartments.  ... 
doi:10.1038/s41598-020-72411-x pmid:32999324 fatcat:j7rm5yb2pfclbgpvb5wg3tqlpa

Non-line-of-sight imaging over 1.43 km

Cheng Wu, Jianjiang Liu, Xin Huang, Zheng-Ping Li, Chao Yu, Jun-Tian Ye, Jun Zhang, Qiang Zhang, Xiankang Dou, Vivek K Goyal, Feihu Xu, Jian-Wei Pan
2021 Proceedings of the National Academy of Sciences of the United States of America  
multiple times, typically using the information encoded in the time-of-flight of scattered photons.  ...  Together, these enable our demonstration of NLOS imaging and real-time tracking of hidden objects over a distance of 1.43 km.  ...
doi:10.1073/pnas.2024468118 pmid:33658383 fatcat:e7z6xgapgzfd5isdojxi3zponm

PHOTOGRAMMETRIC TRACKING OF AERODYNAMIC SURFACES AND AEROSPACE MODELS AT NASA LANGLEY RESEARCH CENTER

Mark R. Shortis, Stuart Robson, Thomas W. Jones, William K. Goad, Charles B. Lunsford
2016 ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences  
Aerospace engineers require measurements of the shape of aerodynamic surfaces and the six degree of freedom (6DoF) position and orientation of aerospace models to analyse structural dynamics and aerodynamic  ...  System calibration and data processing techniques are discussed. Examples of experiments and data outputs are described.  ...  Figure 1. Example of the impact of ambient lighting on measurement of passive wing targets in a wind tunnel.  ...
doi:10.5194/isprs-annals-iii-5-27-2016 fatcat:aortpdhg3bffbhp7mpbj73binu

Indoor and outdoor depth imaging of leaves with time-of-flight and stereo vision sensors: Analysis and comparison

Wajahat Kazmi, Sergi Foix, Guillem Alenyà, Hans Jørgen Andersen
2014 ISPRS Journal of Photogrammetry and Remote Sensing (Print)
Time of Flight sensors are sensitive to ambient light and have low resolution but deliver high frame rate accurate depth data under suitable conditions.  ...  In this article we analyze the response of Time of Flight cameras (active sensors) for close range imaging under three different illumination conditions and compare the results with stereo vision (passive  ...  Acknowledgments This work is supported by the Danish Council for Strategic Research under project ASETA (www.aseta.dk) grant no. 09-067027, the Spanish Ministry of Science and Innovation under projects  ... 
doi:10.1016/j.isprsjprs.2013.11.012 fatcat:qivcrkzgjvgcrpubih3qvd5taa

Stick-slip dynamics in penetration experiments on simulated regolith [article]

Jack Featherstone and Robert Bullard and Tristan Emm and Anna Jackson and Riley Reid and Sean Shefferman and Adrienne Dove and Joshua Colwell and Jonathan E. Kollmer and Karen E. Daniels
2021 arXiv   pre-print
It employs a classic granular physics technique, photoelasticity, to quantify the dynamics of a flexible probe during its insertion into a system of bi-disperse, cm-sized model grains.  ...  We analyze the behavior of a flexible probe inserted into loose regolith simulant as a function of probe speed and ambient gravitational acceleration to explore the relevant dynamics.  ...  ACKNOWLEDGMENTS This work was funded by NASA Grant number 80NSSC18K0269 (REDDI) and National Science Foundation grant number DMR-2104986, along with undergraduate student funding from both the NC State Office of  ...
arXiv:2011.12890v3 fatcat:aov5fx7n4vharcrcjzwac3uire

Aircraft Observations of Dry Air, the ITCZ, Convective Cloud Systems, and Cold Pools in MJO during DYNAMO

Shuyi S. Chen, Brandon W. Kerns, Nick Guy, David P. Jorgensen, Julien Delanoë, Nicolas Viltard, Christopher J. Zappa, Falko Judt, Chia-Ying Lee, Ajda Savarin
2016 Bulletin of the American Meteorological Society (BAMS)
...  and AWOT software packages courtesy of the Department of Energy ARM Climate Research facility and the NOAA National Severe Storms Laboratory, respectively.  ...  and Nick Hall for real-time forecasting support; and Ed Ryan for assisting with the cloud-cluster tracking during DYNAMO.  ...  It includes a total of six RCEs with two from each of the three days.  ...
doi:10.1175/bams-d-13-00196.1 fatcat:wo7ld7dshff6rlwanxqrkzwiee

Signal Processing Opens New Views on Imaging [Special Reports]

John Edwards
2015 IEEE Signal Processing Magazine  
The "time-of-flight" camera resolves distance based on the known speed of light, measuring the time-of-flight of a light signal between the camera's sensor and the subject for each point of the image.  ...  "To construct our imaging system, we therefore had to hack into a time-of-flight development board, literally drilling into the sensor chip to combine it with a custom signal generator."  ... 
doi:10.1109/msp.2015.2437291 fatcat:tlffmjywszdolpdg6dym6doi74
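
The time-of-flight relation described here reduces to half the product of the speed of light and the measured round-trip time. A minimal sketch of that arithmetic (function name and example values are assumptions, not taken from the article):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_time_s):
    """Depth from the measured round-trip time of a light signal.

    The signal travels to the subject and back, so the one-way distance is
    half the time-of-flight multiplied by the speed of light.
    """
    return 0.5 * C * round_trip_time_s

# A 10 ns round trip corresponds to roughly 1.5 m.
print(tof_depth_m(10e-9))
```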

Three-Dimensional Morphology and Size Measurement of High-Temperature Metal Components Based on Machine Vision Technology: A Review

Xin Wen, Jingpeng Wang, Guangyu Zhang, Lianqiang Niu
2021 Sensors  
Recently, machine vision technology has been developed, which is a non-contact measurement technology that only needs to capture multiple images of a measured object to obtain the 3D size and morphology  ...  The three-dimensional (3D) size and morphology of high-temperature metal components need to be measured in real time during manufacturing processes, such as forging and rolling.  ...  The combined filtering technology uses blue physical filters and digital image processing technology to suppress the R component.  ... 
doi:10.3390/s21144680 fatcat:rtncwtxlsrew3k3mk2jyi3cxyq
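
The combined-filtering idea in this snippet, a physical blue filter plus digital suppression of the red component that dominates the thermal glow of hot metal, can be illustrated with a toy sketch. The RGB array layout and function name are assumptions, and a real pipeline would attenuate the red light optically before the sensor rather than merely rescaling it afterwards:

```python
import numpy as np

def suppress_red_channel(rgb_image, keep_fraction=0.0):
    """Toy digital counterpart of the blue-filter idea (illustrative only).

    rgb_image: H x W x 3 array in RGB channel order (layout assumed).
    keep_fraction: portion of the red channel to retain; 0 removes it entirely.
    """
    out = rgb_image.astype(np.float32).copy()
    out[..., 0] *= keep_fraction  # red channel assumed to be the first channel
    return out
```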

A 1.5Mpixel RGBZ CMOS image sensor for simultaneous color and range image capture

Wonjoo Kim, Wang Yibing, Ilia Ovsiannikov, SeungHoon Lee, Yoondong Park, Chilhee Chung, Eric Fossum
2012 2012 IEEE International Solid-State Circuits Conference  
A 1.5Mpixel RGBZ image sensor that simultaneously captures color (RGB) and time-of-flight (ToF) range (Z) images is presented.  ...  ToF range sensing has several advantages over stereo-imaging or motion-computed 3D imaging, including single-lens implementation, less-intensive computing requirements, and freedom from occlusion artifacts.  ...
doi:10.1109/isscc.2012.6177061 dblp:conf/isscc/KimWOLPCF12 fatcat:grhbvkbh2rgxflukj5yajgyrqy

Modeling and Analysis of a Direct Time-of-Flight Sensor Architecture for LiDAR Applications

Preethi Padmanabhan, Chao Zhang, Edoardo Charbon
2019 Sensors  
Direct time-of-flight (DTOF) is a prominent depth sensing method in light detection and ranging (LiDAR) applications.  ...  A flash LiDAR setup is simulated with typical operating conditions of a wide-angle field-of-view (FOV = 40°) in a 50 klux ambient light assumption.  ...  Depth sensing in LiDAR systems is a complex process due to a number of challenges. High background noise from ambient light is one of the primary challenges.  ...
doi:10.3390/s19245464 pmid:31835807 pmcid:PMC6960641 fatcat:bugteztvzra6xcv7362lhhrtge
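
To make the ambient-noise challenge concrete, here is a minimal, hypothetical sketch of direct time-of-flight depth estimation from photon timestamps. It builds a timing histogram, treats ambient light as a roughly flat floor estimated from the median bin count, and accepts the peak bin only if it rises clearly above that floor. Bin width, range gate, the significance test, and all names are assumptions, not the paper's model:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def dtof_depth_from_timestamps(timestamps_s, bin_width_s=1e-9, max_range_m=150.0):
    """Illustrative direct-TOF depth estimate from SPAD photon timestamps."""
    n_bins = int((2 * max_range_m / C) / bin_width_s)
    hist, edges = np.histogram(timestamps_s, bins=n_bins,
                               range=(0.0, n_bins * bin_width_s))
    floor = np.median(hist)                    # ambient light: roughly uniform floor
    peak_bin = int(np.argmax(hist))
    if hist[peak_bin] < floor + 5 * np.sqrt(floor + 1):  # crude significance check
        return None                            # peak indistinguishable from background
    peak_time = 0.5 * (edges[peak_bin] + edges[peak_bin + 1])
    return 0.5 * C * peak_time                 # convert round-trip time to distance
```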
Showing results 1–15 of 3,019 results