Feature Tracking and Synchronous Scene Generation with a Single Camera
International Journal of Image, Graphics and Signal Processing
This paper presents a method of tracking feature points to update the camera pose and generating a synchronous map for an AR (Augmented Reality) system. First, we use the ORB (Oriented FAST and Rotated BRIEF) detection algorithm to detect feature points that carry depth information to serve as markers, and we track four of them with the LK (Lucas-Kanade) optical flow algorithm. We then compute the rotation and translation of the moving camera from the relationship matrix between 2D image coordinates and 3D world coordinates, and update the camera pose accordingly. Finally, we generate the map and draw AR objects on it. If the tracked feature points are lost, we recover tracking by computing the same world coordinates as before the loss, using the new corresponding 2D/3D feature points and the camera pose at that time. This study offers three novelties: an improved ORB detector that obtains depth information, a rapid camera-pose update, and tracking recovery. Following PTAM (Parallel Tracking and Mapping), we divide the process into two parallel threads: one detects and tracks the feature points (recovering when necessary) and updates the camera pose; the other generates the map and draws the AR objects. This parallel design saves time and lets the AR system run in real time.