Research and Design of an Airfield Runway FOD Detection System Based on WSN
2013
International Journal of Distributed Sensor Networks
This paper introduces a design method for an airport runway FOD detection system based on WSN, analyzes the functions of various image acquisition sensors, and introduces an FOD image analysis algorithm. ...
Finally, WSN data fusion technology is used to analyze the FOD image. ...
At present, line detection methods are basically based on this approach. ...
doi:10.1155/2013/839586
fatcat:2wn4d7cvxvgonbqdwxg6c7jcti
Integrating Millimeter Wave Radar with a Monocular Vision Sensor for On-Road Obstacle Detection Applications
2011
Sensors
As a whole, a three-level fusion strategy based on visual attention mechanism and driver's visual consciousness is provided for MMW radar and monocular vision fusion so as to obtain better comprehensive ...
An adaptive thresholding algorithm based on a new understanding of shadows in the image is adopted for obstacle detection, and edge detection is used to assist in determining the boundary of obstacles. ...
Although there is already much research on fusing radar and vision [11]–[18], this paper provides an approach for the fusion of radar and a monocular camera for on-road ...
doi:10.3390/s110908992
pmid:22164117
pmcid:PMC3231508
fatcat:qwcyiyjkm5eyjp6mdzi37g6ctm
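The adaptive-thresholding step mentioned in this entry can be illustrated with a short sketch. This is a minimal illustration, not the authors' implementation; the shadow-intensity heuristic, the 1.5-sigma margin, and the OpenCV function choices are assumptions.

# Minimal sketch of shadow-based adaptive thresholding plus edge detection
# for on-road obstacle candidates. Illustrative reconstruction only; the
# thresholds and minimum blob area are assumed values.
import cv2
import numpy as np

def obstacle_candidates(gray_road_image):
    # Estimate the typical road intensity from the lower half of the frame,
    # where the road surface usually dominates (assumption).
    road_region = gray_road_image[gray_road_image.shape[0] // 2:, :]
    road_mean = float(road_region.mean())
    road_std = float(road_region.std())

    # Pixels much darker than the road are treated as shadow cast by an
    # obstacle; the 1.5-sigma margin is an assumed tuning parameter.
    shadow_mask = (gray_road_image < road_mean - 1.5 * road_std).astype(np.uint8) * 255

    # Edge detection assists in delimiting the obstacle boundary above the shadow.
    edges = cv2.Canny(gray_road_image, 50, 150)

    # Keep connected shadow blobs large enough to be vehicle-sized candidates.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(shadow_mask)
    boxes = [stats[i, :4] for i in range(1, num) if stats[i, cv2.CC_STAT_AREA] > 200]
    return boxes, edges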
Detection scheme for a partially occluded pedestrian based on occluded depth in lidar–radar sensor fusion
2017
Optical Engineering: The Journal of SPIE
Son, "Detection scheme for a partially occluded pedestrian based on occluded depth in lidar-radar sensor fusion," Opt. Eng. Abstract. ...
In the proposed method, the lidar and radar regions of interest (RoIs) have been selected based on the respective sensor measurement. ...
Edge-contour-based reasoning is a method to infer an object from the edge and contour of an occluded object. ...
doi:10.1117/1.oe.56.11.113112
fatcat:z63ffardbnha3aoyqas4fjcqku
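The RoI-selection idea in this entry can be sketched as follows. The RoI format (x, y, w, h in a common bird's-eye grid) and the overlap test are assumptions for illustration, not the scheme from the paper.

# Illustrative sketch of combining lidar and radar regions of interest (RoIs).
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x, y, w, h).
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def fuse_rois(lidar_rois, radar_rois, iou_threshold=0.3):
    """Pair each radar RoI with the best-overlapping lidar RoI.

    Radar RoIs with no lidar support are kept as 'possibly occluded'
    candidates, mirroring the idea that radar can still return a target
    hidden from the lidar.
    """
    fused, occluded_candidates = [], []
    for r in radar_rois:
        best = max(lidar_rois, key=lambda l: iou(r, l), default=None)
        if best is not None and iou(r, best) >= iou_threshold:
            fused.append((r, best))
        else:
            occluded_candidates.append(r)
    return fused, occluded_candidates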
Vehicle and Guard Rail Detection Using Radar and Vision Data Fusion
2007
IEEE transactions on intelligent transportation systems (Print)
This paper describes a vehicle detection system fusing radar and vision data. Radar data are used to locate areas of interest on images. ...
Vehicle search in these areas is mainly based on vertical symmetry. ...
A fusion system driven by radar, such as the one proposed here, needs a very reliable radar because any radar miss cannot be recovered by vision. ...
doi:10.1109/tits.2006.888597
fatcat:nnazq3vcmfe7bgfrvia5cr4oxa
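The vertical-symmetry search described in this entry can be sketched in a few lines. The scoring formula, threshold, and window handling are illustrative assumptions, not the authors' method.

# Sketch of a vertical-symmetry check inside radar-derived image windows,
# in the spirit of the radar-driven vision search described above.
import numpy as np

def vertical_symmetry_score(window):
    """Return a 0..1 score: 1 means the grayscale window is perfectly
    mirror-symmetric about its vertical axis (typical of a vehicle rear)."""
    w = window.astype(np.float32)
    mirrored = w[:, ::-1]
    diff = np.abs(w - mirrored).mean()
    return 1.0 - diff / 255.0

def search_vehicles(image, radar_boxes, min_score=0.8):
    """Keep only radar-proposed boxes whose image content looks symmetric."""
    confirmed = []
    for (x, y, w, h) in radar_boxes:
        window = image[y:y + h, x:x + w]
        if window.size and vertical_symmetry_score(window) >= min_score:
            confirmed.append((x, y, w, h))
    return confirmed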
A Novel Multi-sensor Fusion Based Object Detection and Recognition Algorithm for Intelligent Assisted Driving
2021
IEEE Access
For assisted driving, this paper focuses on the data fusion of a camera and MMW radar instead of LiDAR, considering the cost issue and the limited resources of edge computing. ...
completed obstacle detection based on the fusion of MMW radar and camera image data. ...
doi:10.1109/access.2021.3083503
fatcat:jjnbjucarnabxcr6bcx3vo2gmm
Review on Millimeter-Wave Radar and Camera Fusion Technology
2022
Sustainability
Cameras allow for highly accurate identification of targets. However, it is difficult to obtain spatial position and velocity information about a target by relying solely on images. ...
The data fusion algorithms from MMW radar and camera are described separately from traditional fusion algorithms and deep learning based algorithms, and their advantages and disadvantages are briefly evaluated ...
Based on this, Liu [60] proposed a single-strain-based method for calibrating millimetre-wave radar data and CCD camera data. The method does not require manual operation for calibration. Wang et al. ...
doi:10.3390/su14095114
fatcat:xozsw2auhjbpbccyf3flsfnvje
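A building block shared by most of the fusion pipelines surveyed in this entry is projecting a radar detection into the image plane through calibration. The sketch below assumes known extrinsics (R, t) and intrinsics (K) and a flat-earth radar point; it is generic, not a method from the review.

# Sketch of the standard radar-to-image projection step: transform a radar
# detection (range/azimuth in the radar frame) into pixel coordinates via
# assumed calibration matrices.
import numpy as np

def radar_to_pixel(rng, azimuth_rad, R, t, K):
    """rng/azimuth: radar measurement; R, t: radar-to-camera extrinsics;
    K: 3x3 camera intrinsics. All calibration values are assumed known."""
    # Radar point in its own frame (flat-earth assumption, z = 0).
    p_radar = np.array([rng * np.cos(azimuth_rad), rng * np.sin(azimuth_rad), 0.0])
    # Into the camera frame, then through the pinhole model.
    p_cam = R @ p_radar + t
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]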
Review on Vehicle Detection Technology for Unmanned Ground Vehicles
2021
Sensors
Environmental perception technology is the foundation of UGVs, which is of great significance to achieve a safer and more efficient performance. ...
Secondly, related works about one of the most important aspects of environmental perception technology—vehicle detection—are reviewed and compared in detail in terms of different sensors. ...
Here, we only summarize recent sensor-fusion-based methods for vehicle detection including Radar-Vision fusion and Lidar-Vision fusion methods, for detailed architecture and methods about sensor fusion ...
doi:10.3390/s21041354
pmid:33672976
fatcat:ammlsccxbbhgpkx6r5vod7ciuy
Road and lane edge detection with multisensor fusion methods
1999
Proceedings 1999 International Conference on Image Processing (Cat. 99CH36348)
The multisensor fusion edge detection problem is posed in a Bayesian framework and a joint MAP estimate is employed to locate the road and lane boundaries. ...
This paper treats automated detection of road and lane boundaries by fusing information from forward-looking optical and active W-band radar imaging sensors mounted on a motor vehicle. ...
We can see that the fusion method proposed in this paper outperforms the edge detection based on single sensors. ...
doi:10.1109/icip.1999.822983
dblp:conf/icip/MaLH99
fatcat:pgdnsi7nxjekbltwt6tdhsufo4
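The joint MAP estimate referred to in this entry can be written compactly as follows; the symbols (boundary parameters theta, optical image I_o, radar image I_r) are generic placeholders rather than the paper's exact notation.

\hat{\theta}
  = \arg\max_{\theta}\; p(\theta \mid I_o, I_r)
  = \arg\max_{\theta}\; p(I_o \mid \theta)\, p(I_r \mid \theta)\, p(\theta)

Here the two likelihoods are assumed conditionally independent given \theta, and the prior p(\theta) encodes knowledge of road and lane geometry such as smoothness and near-parallelism of the boundaries.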
HODET: Hybrid Object DEtection and Tracking using mmWave Radar and Visual Sensors
[article]
2020
arXiv
pre-print
This paper proposes a novel Hybrid Object DEtection and Tracking (HODET) using mmWave Radar and Visual Sensors at the edge. ...
Edge computing technology has the potential to address a number of issues such as real-time processing requirements, off-loading of processing from congested servers, and size, weight, power, and cost ...
Another solution based on image and radar sensor fusion focused on pedestrian safety, which used sensor fusion at two levels (low and high) of their architecture [34]. A symmetrical deep convolutional ...
arXiv:2004.06861v1
fatcat:tng3xomea5cxhezqx2735ja7eq
Automatic Label Creation Framework for FMCW Radar Images Using Camera data
2021
IEEE Access
On the other hand, the camera data preprocessing is based on computer vision techniques, specifically DNN. ...
As a result, the complexity of the algorithm is higher than that of our proposed pipeline. Z. Ji et al. [17] proposed a method to locate and classify objects based on radar and camera sensors. ...
doi:10.1109/access.2021.3087207
fatcat:ywbj2au5fve3fm6a5nlkjqyr54
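The automatic-labeling idea in this entry can be sketched as transferring camera DNN detections to time-aligned radar frames. The data layout and the camera_box_to_radar_cell() helper are hypothetical placeholders, not the paper's pipeline.

# Illustrative sketch of automatic label creation: camera detections from a
# DNN are projected into the radar grid and stored as labels for the
# time-aligned radar frame.
def make_radar_labels(radar_frames, camera_detections, camera_box_to_radar_cell):
    """radar_frames: {timestamp: frame}; camera_detections: {timestamp: [(box, cls)]}.
    Returns {timestamp: [(radar_cell, cls)]} for time-aligned pairs."""
    labels = {}
    for ts, frame in radar_frames.items():
        dets = camera_detections.get(ts, [])
        labels[ts] = [(camera_box_to_radar_cell(box), cls) for box, cls in dets]
    return labels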
Fail-Safe Human Detection for Drones Using a Multi-Modal Curriculum Learning Approach
2021
IEEE Robotics and Automation Letters
Currently, however, people detection systems used on drones are solely based on standard cameras, apart from an emerging number of works discussing the fusion of imaging and event-based cameras. ...
In order to enable the fusion of radars with both event-based and standard cameras, we present KUL-UAVSAFE, a first-of-its-kind dataset for the study of safety-critical people detection by drones. ...
Compared to our work, they use two distinct DNNs (one for each modality), each requiring GPU compute power, while we use a single CNN based on the SqueezeNet backbone [6] . ...
doi:10.1109/lra.2021.3125450
fatcat:z27e6jqrlnaedoezds4exewxey
Obstacle detection and classification fusing radar and vision
2008
2008 IEEE Intelligent Vehicles Symposium
The camera is mainly used to refine the vehicles' boundaries detected by the radar and to discard those that might be false positives; at the same time, a symmetry-based pedestrian detection algorithm is ...
This paper presents a system whose aim is to detect and classify road obstacles, like pedestrians and vehicles, by fusing data coming from different sensors: a camera, a radar, and an inertial sensor. ...
It exploits a vehicle detection algorithm, based on fusion of camera images and radar data [1], [2], to detect vehicles, while a pedestrian detection algorithm [3], [4] is exploited to detect the ...
doi:10.1109/ivs.2008.4621304
fatcat:4vjjs6jsbzfwhnikpi3dzaroti
The Improved A* Obstacle Avoidance Algorithm for the Plant Protection UAV with Millimeter Wave Radar and Monocular Camera Data Fusion
2021
Remote Sensing
, and inflection point optimization based on millimeter wave radar and monocular camera data fusion. ...
The results show that the maximum error in distance measurement of data fusion method was 8.2%. ...
point optimization based on the data fusion. ...
doi:10.3390/rs13173364
doaj:eaa90d46a9bf443f9029265a7bee8acb
fatcat:4mi4qgc525amrlrrzcekbtv3zq
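For context on what the "improved A*" in this entry refines, a baseline grid A* can be sketched as below. The occupancy-grid representation, 4-connected moves, and Manhattan heuristic are assumptions; the paper's inflection-point optimization and fusion-derived obstacle map are not reproduced.

# Minimal baseline A* on an occupancy grid; grid[r][c] == 1 marks an
# obstacle cell (e.g., produced by radar-camera fusion).
import heapq

def a_star(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:          # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:               # reconstruct path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None                        # no collision-free path found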
A Survey on Sensor Technologies for Unmanned Ground Vehicles
[article]
2020
arXiv
pre-print
This paper proposes a brief review on sensor technologies for UGVs. Firstly, characteristics of various sensors are introduced. ...
Unmanned ground vehicles have a huge development potential in both civilian and military fields, and have become the focus of research in various countries. ...
Several studies on infrared cameras have been put forward as follows: some researchers established methods based on appearance or features extracted from the image, specifically edge [125], HOG ...
arXiv:2007.01992v2
fatcat:wprqjg7prngn3du4y3xqvfgckq
An obstacle detection method by fusion of radar and motion stereo
2002
IEEE transactions on intelligent transportation systems (Print)
An object detection method based on the motion stereo technique is proposed, which works by the fusion of millimeter wave radar and a single video camera. ...
The method does not depend on the appearance of the target object, so it is not only capable of detecting automobiles, but also other objects that come within its range. ...
Therefore, as a less expensive option, a combination of radar-and vision-based sensing using a single camera is promising. ...
doi:10.1109/tits.2002.802932
fatcat:a36kmxb7ljbrfhojja7aas62ia