Calibration of hand-held camera sequences for plenoptic modeling
Proceedings of the Seventh IEEE International Conference on Computer Vision
In this contribution we focus on the calibration of very long image sequences from a hand-held camera that samples the viewing sphere of a scene. View sphere sampling is important for plenoptic (image-based) modeling, which captures the appearance of a scene by storing images from all possible directions. The plenoptic approach is appealing since it allows, in principle, fast rendering of scenes with complex geometry and surface reflections, without the need for an explicit geometrical scene model. However, the acquired images have to be calibrated, and current approaches mostly use pre-calibrated acquisition systems, which limits the generality of the approach. We propose a way out by using only an uncalibrated hand-held camera. The image sequence is acquired by simply waving the camera around the scene objects, creating a zigzag scan path over the viewing sphere. We extend the sequential camera tracking of an existing structure-from-motion approach to the calibration of a mesh of viewpoints. Novel views are generated by piecewise mapping and interpolating the new image from the nearest viewpoints according to the viewpoint mesh. Local depth map estimates enhance the rendering process. Extensive experiments with ground truth data and hand-held sequences confirm the performance of our approach. Metric approximation and image-based rendering conclude this contribution.
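The rendering idea described above, interpolating a novel view from the nearest viewpoints in the mesh, can be sketched as follows. This is a simplified illustration, not the paper's implementation: all function names are hypothetical, depth-based warping is omitted, and the per-triangle mesh interpolation is approximated by blending the k angularly nearest views with inverse-angle weights.

```python
import numpy as np

def nearest_viewpoints(view_dirs, target_dir, k=3):
    """Return indices of the k mesh viewpoints whose viewing
    directions are angularly closest to the target direction,
    plus the angular distances to all viewpoints."""
    d = view_dirs / np.linalg.norm(view_dirs, axis=1, keepdims=True)
    t = target_dir / np.linalg.norm(target_dir)
    angles = np.arccos(np.clip(d @ t, -1.0, 1.0))
    return np.argsort(angles)[:k], angles

def blend_views(images, view_dirs, target_dir, k=3):
    """Synthesize a novel view as a weighted blend of the k nearest
    views (a stand-in for the piecewise mapping over the viewpoint
    mesh; assumes the images are already warped into alignment)."""
    idx, angles = nearest_viewpoints(view_dirs, target_dir, k)
    # Inverse-angle weights: closer viewpoints dominate the blend.
    w = 1.0 / (angles[idx] + 1e-6)
    w /= w.sum()
    out = np.zeros_like(images[idx[0]], dtype=float)
    for i, wi in zip(idx, w):
        out += wi * images[i].astype(float)
    return out
```

In the actual approach, each pixel would additionally be re-mapped using the calibrated camera poses and the local depth estimates before blending; the inverse-angle weighting here merely mimics the effect that the nearest mesh viewpoints contribute most to the novel view.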