Road Scene Content Analysis for Driver Assistance and Autonomous Driving

Melih Altun, Mehmet Celenk
IEEE Transactions on Intelligent Transportation Systems, 2017 (Print)
This research develops a vision-based driver assistance system that achieves scene awareness from video frames captured by a dashboard camera. A saliency map is formed from features pertinent to the driving scene. This map, modeled on contour- and motion-sensitive human visual perception, is devised by extracting spatial, spectral, and temporal information from the input frames and fusing the results. The fusion output contains high-level descriptors for segment boundaries and non-stationary objects. Following the segmentation and foreground object detection stage, an adaptive Bayesian learning framework classifies road surface regions, and the detected foreground objects are tracked via Kalman filtering, which in turn enables prediction of potential collisions with the tracked objects. Furthermore, the vehicle path is used in conjunction with the extracted road information to detect deviations from the road surface. The system produces an augmented-reality output in which video frames are context-enhanced with the object tracking and road surface information. The proposed scene-driven vision system improves the driver's situational awareness through adaptive road surface classification, object tracking, and collision estimation. Experimental results demonstrate that context-aware low-level to high-level information fusion based on a human vision model yields superior segmentation, tracking, and classification results, leading to a high-level abstraction of the driving scene.
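The Kalman-filter tracking stage mentioned above can be sketched as a standard predict/update cycle over each detected object's centroid. The constant-velocity state model, matrix values, and function names below are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of a constant-velocity Kalman tracker for a detected
# foreground object's centroid. State layout, noise levels, and names
# are assumptions for illustration only.
import numpy as np

def make_cv_kalman(dt=1.0, q=1e-2, r=1.0):
    # State: [x, y, vx, vy]; measurement: [x, y] (object centroid).
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)  # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # measurement model
    Q = q * np.eye(4)  # process noise covariance
    R = r * np.eye(2)  # measurement noise covariance
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    # Predict the next state and covariance.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measured centroid z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Running this cycle per frame yields a smoothed position and velocity estimate for each tracked object; extrapolating the state forward along the estimated velocity is one common way to flag potential collisions with the ego vehicle's path.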
doi:10.1109/tits.2017.2688352