Radar and stereo vision fusion for multitarget tracking on the special Euclidean group

Josip Ćesić, Ivan Marković, Igor Cvišić, Ivan Petrović
2016 Robotics and Autonomous Systems  
Reliable scene analysis under varying conditions is an essential task in nearly any assistance or autonomous system application, and advanced driver assistance systems (ADAS) are no exception. ADAS commonly involve adaptive cruise control, collision avoidance, lane change assistance, traffic sign recognition, and parking assistance, with the ultimate goal of producing a fully autonomous vehicle. The present paper addresses detection and tracking of moving objects within the context of ADAS. We
use a multisensor setup consisting of a radar and a stereo camera mounted on top of a vehicle. We propose to model the sensors' uncertainty in polar coordinates on Lie groups and to perform the object state filtering on Lie groups, specifically on the product of two special Euclidean groups, i.e., SE(2) × SE(2). To this end, we derive the designed filter within the framework of the extended Kalman filter on Lie groups. We assert that the proposed approach results in more accurate uncertainty modeling, since the employed sensors exhibit contrasting measurement uncertainty characteristics and the predicted target motions yield banana-shaped uncertainty contours. We believe that accurate uncertainty modeling is an important ADAS topic, especially where safety applications are concerned. To solve the multitarget tracking problem, we use the joint integrated probabilistic data association filter and present the modifications necessary to use it on Lie groups. The proposed approach is tested on a real-world dataset collected with the described multisensor setup in urban traffic scenarios.

There does not exist a sensing system that could solely deliver the full information required for adequate quality of ADAS applications [2]. Given that, ADAS commonly rely on complementary sensing systems: vision, millimeter-wave radar, laser range finders (LRFs), or combinations thereof. Radar units are able to produce accurate measurements of the relative speed of and distance to objects. LRFs have higher lateral resolution than radars and, besides accurate object distance, they can detect the occupancy area of an object and provide a detailed scene representation [3]. Regarding robustness, radar units are more resilient to rain, fog, snow, and similar conditions that may hinder LRFs, but as a drawback they produce a significant amount of clutter.
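As a rough illustration of why filtering on SE(2) matters, the following sketch (ours, not from the paper; all noise values are hypothetical) samples a concentrated Gaussian in the tangent space se(2), maps the samples through the exponential map, and composes each pose with a straight-line motion. The heading uncertainty rotates the predicted displacement, bending the position cloud into the banana-shaped contour the abstract refers to:

```python
import numpy as np

def se2_exp(xi):
    """Exponential map se(2) -> SE(2): tangent vector [rho_x, rho_y, theta]
    mapped to a pose (x, y, heading)."""
    rho, theta = xi[:2], xi[2]
    if abs(theta) < 1e-9:
        V = np.eye(2)
    else:
        s, c = np.sin(theta), np.cos(theta)
        V = np.array([[s, -(1.0 - c)],
                      [1.0 - c, s]]) / theta
    x, y = V @ rho
    return np.array([x, y, theta])

def se2_compose(a, b):
    """Group composition of two poses (x, y, heading)."""
    c, s = np.cos(a[2]), np.sin(a[2])
    return np.array([a[0] + c * b[0] - s * b[1],
                     a[1] + s * b[0] + c * b[1],
                     a[2] + b[2]])

rng = np.random.default_rng(0)
# Concentrated Gaussian on SE(2): identity mean, tangent-space covariance
# with a deliberately large heading variance (hypothetical values).
cov = np.diag([0.01, 0.01, 0.3])              # [m^2, m^2, rad^2]
tangent = rng.multivariate_normal(np.zeros(3), cov, size=1000)
poses = np.array([se2_exp(xi) for xi in tangent])

# Predict a 5 m straight-line motion for every sampled pose. The heading
# spread rotates the displacement, so the position cloud forms an arc --
# a shape a single Cartesian Gaussian cannot represent faithfully.
moved = np.array([se2_compose(p, np.array([5.0, 0.0, 0.0])) for p in poses])
```

A conventional Kalman filter in Cartesian coordinates would approximate this cloud with an axis-aligned ellipse, whereas keeping the density in the tangent space of the group, as an extended Kalman filter on Lie groups does, keeps it Gaussian there.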
Vision-based sensing systems can also provide accurate lateral measurements and a wealth of other information from images, thus providing an effective supplement to range-based sensors for road scene analysis. As an example, a stereo vision sensor can deliver target detection with high lateral resolution but less certain range, while usually providing enough information for identification and classification of objects, whereas radar provides accurate measurements of range and relative speed. Given the complementarity of radars and vision systems, this combination is commonly used in research on ADAS applications. For example, works based on a monocular camera use radar for finding regions of interest in the image [4-7], process image and radar data separately [8-10], use motion stereo to reconstruct object boundaries [11, 12],
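The contrasting uncertainty shapes of the two sensors can be made concrete with a small sketch (ours, with hypothetical noise figures rather than the paper's calibrated ones). Propagating a diagonal polar measurement covariance to Cartesian coordinates through the standard first-order linearization shows the radar tight in range but wide in cross-range, the stereo camera the reverse, and their fusion tight in both:

```python
import numpy as np

def polar_to_cartesian_cov(r, phi, sigma_r, sigma_phi):
    """First-order (EKF-style) propagation of a diagonal polar covariance
    diag(sigma_r^2, sigma_phi^2) into Cartesian (x, y) coordinates."""
    J = np.array([[np.cos(phi), -r * np.sin(phi)],
                  [np.sin(phi),  r * np.cos(phi)]])
    return J @ np.diag([sigma_r**2, sigma_phi**2]) @ J.T

# Target 20 m straight ahead; noise figures are illustrative assumptions.
r, phi = 20.0, 0.0
radar  = polar_to_cartesian_cov(r, phi, sigma_r=0.2, sigma_phi=np.deg2rad(2.0))
stereo = polar_to_cartesian_cov(r, phi, sigma_r=1.5, sigma_phi=np.deg2rad(0.3))

# At phi = 0, x is the range direction and y the lateral one. Fusing the
# two measurements in information form shrinks both directions at once.
fused = np.linalg.inv(np.linalg.inv(radar) + np.linalg.inv(stereo))
```

The fused covariance inherits the radar's range accuracy and the camera's lateral accuracy, which is precisely the complementarity the paragraph above describes.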
doi:10.1016/j.robot.2016.05.001