Weighted Guided Image Filtering and Haze Removal in Single Image

H. Geethu, S. Shamna, Jubilant J. Kizhakkethottam
2016 Procedia Technology - Elsevier  
Many applications in computational photography and image processing require smoothing techniques that preserve edges well. Such smoothing typically decomposes the image to be filtered into two layers: a base layer formed by homogeneous regions with sharp edges, and a detail layer containing noise and fine-scale texture. Local filtering-based edge-preserving smoothing techniques suffer from halo artifacts. To address this problem, a weighted guided image filter (WGIF) is introduced by incorporating an edge-aware weighting into the existing guided image filter (GIF). The WGIF is applied to single-image detail enhancement, single-image haze removal, and fusion of differently exposed images.

Poor visibility degrades perceptual image quality as well as the performance of computer vision algorithms used in surveillance, object detection, tracking, and segmentation. Poor visibility in bad weather such as fog, mist, and haze is caused by water droplets suspended in the air: light is scattered in the atmosphere before it reaches the camera. Fog formation is governed by attenuation and airlight; attenuation reduces contrast, while airlight increases the whiteness of the scene. The proposed algorithm uses a bilateral filter to estimate the airlight and recover scene contrast. Qualitative and quantitative analysis demonstrates that the proposed algorithm performs well in comparison with prior state-of-the-art algorithms. It is independent of fog density, requires no user intervention, and handles both color and grayscale images, giving it wide application in tracking and navigation, consumer electronics, and the entertainment industry. The proposed weighted guided image filter improves image quality, and the haze removal algorithm also yields a high-quality depth map.
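The GIF that the WGIF builds on solves, in each local window, a linear regression of the filtering input against the guidance image; the WGIF's contribution is to replace the constant regularizer eps with an edge-aware weight. As a minimal sketch, the plain (unweighted) GIF with a box-filter mean can be written as follows; the helper `box_mean` and the parameter names are our own, not from the paper:

```python
import numpy as np

def box_mean(x, r):
    """Mean over a (2r+1)x(2r+1) window, edge-replicated (summed-area table)."""
    xp = np.pad(x, r, mode='edge')
    c = np.cumsum(np.cumsum(xp, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))                 # zero row/col for window sums
    k = 2 * r + 1
    s = (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k])
    return s / (k * k)

def guided_filter(I, p, r, eps):
    """Sketch of the guided image filter: per-window linear model q = a*I + b.
    The WGIF would divide eps by an edge-aware weight computed from I."""
    mean_I = box_mean(I, r)
    mean_p = box_mean(p, r)
    cov_Ip = box_mean(I * p, r) - mean_I * mean_p
    var_I = box_mean(I * I, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)                      # slope: near 1 at strong edges
    b = mean_p - a * mean_I
    # average the coefficients over all windows covering each pixel
    return box_mean(a, r) * I + box_mean(b, r)
```

Filtering an image with itself as guidance (`p = I`) gives the edge-preserving smoothing used for base/detail decomposition: flat regions are averaged (`a ≈ 0`), while high-variance edges are kept (`a ≈ 1`).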
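The attenuation/airlight decomposition described above is the standard haze formation model I(x) = J(x)·t(x) + A·(1 − t(x)), where J is the scene radiance, t the transmission, and A the airlight. A minimal sketch of the model and its inversion is below; the transmission is assumed known here, whereas the paper estimates the airlight with a bilateral filter:

```python
import numpy as np

def add_haze(J, t, A):
    """Haze formation: direct attenuation J*t plus airlight A*(1-t).
    Attenuation lowers contrast; the airlight term adds whiteness."""
    return J * t + A * (1.0 - t)

def dehaze(I, t, A, t0=0.1):
    """Invert the model; clamp t to t0 to avoid amplifying noise
    in dense-haze regions where t is near zero."""
    return (I - A) / np.maximum(t, t0) + A
```

With a constant transmission t < 1, the hazy image has strictly lower contrast (its standard deviation shrinks by a factor of t) and a mean pulled toward the airlight A, which is exactly the degradation the abstract attributes to attenuation and airlight.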
doi:10.1016/j.protcy.2016.05.248