Fusion of Vision Inertial Data for Automatic Georeferencing
Knowledge Discovery from Sensor Data
Intermittent loss of the GPS signal is a common problem in intelligent land navigation based on GPS-integrated inertial systems. This issue emphasizes the need for an alternative technology that can ensure smooth and reliable inertial navigation during GPS outages. This paper presents the results of an effort to integrate data from vision and inertial sensors. Such integration first requires obtaining the necessary navigation parameters from the available sensors, and because the measurements differ in nature, separate approaches must be used to estimate those parameters. Information from a sequence of images, captured by a monocular camera attached to a survey vehicle at a maximum frequency of 3 frames per second, was used to correct the inherent error accumulation of the inertial system installed in the same vehicle. Specifically, the rotations and translations estimated from point correspondences tracked through the image sequence were used in the integration. A pre-filter is also utilized to smooth the noise associated with the vision sensor (camera) measurements. Finally, the vision-based position estimates are integrated with the inertial system in a decentralized format using a Kalman filter. The vision/inertial integrated position estimates compare favorably with the output of an inertial/GPS system, demonstrating that vision can successfully supplement inertial measurements during potential GPS outages.
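The fusion step described above can be illustrated with a deliberately simplified, hypothetical sketch: a one-dimensional linear Kalman filter in which inertial dead reckoning (modeled here as integration of a biased acceleration) is periodically corrected by vision-derived position fixes. The state model, noise covariances, bias value, and update rate below are illustrative assumptions for exposition, not the parameters or filter structure used in the paper.

```python
import numpy as np

def kalman_fuse(ins_accels, vision_positions, dt=1/3, accel_bias=0.05):
    """Fuse INS dead reckoning with intermittent vision position fixes.

    ins_accels: per-step accelerations from the inertial sensor (biased).
    vision_positions: per-step vision-derived position fixes, or None
    when no camera fix is available at that step. All values illustrative.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition for [pos, vel]
    B = np.array([0.5 * dt**2, dt])        # control input (acceleration)
    H = np.array([[1.0, 0.0]])             # vision measures position only
    Q = np.eye(2) * 1e-3                   # process noise (assumed)
    R = np.array([[0.25]])                 # vision measurement noise (assumed)

    x = np.zeros(2)                        # initial state [pos, vel]
    P = np.eye(2)
    estimates = []
    for a, z in zip(ins_accels, vision_positions):
        # Predict: propagate with the (biased) inertial acceleration.
        x = F @ x + B * (a + accel_bias)
        P = F @ P @ F.T + Q
        if z is not None:                  # vision fix available this step
            y = z - H @ x                  # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates
```

When `vision_positions` contains only `None`, the sketch degenerates to pure dead reckoning and the acceleration bias accumulates quadratically in position; with periodic vision fixes, the filter bounds that drift, which is the role the camera plays during GPS outages in the system described above.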