Nighttime Defogging Using High-Low Frequency Decomposition and Grayscale-Color Networks

Wending Yan, Robby T. Tan, Dengxin Dai

Abstract


In this paper, we address the problem of nighttime defogging from a single image. We propose a framework consisting of two main modules: a grayscale module and a color module. Given an RGB foggy nighttime image, our grayscale module takes the grayscale version of the image as input and decomposes it into high and low frequency layers. The high frequency layer contains the scene texture information, which is less affected by fog; the low frequency layer contains the scene layout/structure information, including fog and glow. Our grayscale module then enhances the visibility of the textures in the high frequency layer and removes the presence of glow and fog in the low frequency layer. Having processed the high/low frequency information, it fuses the layers to obtain a grayscale defogged image. Our second module, the color module, takes the original RGB image as input. This module has operations similar to those of the grayscale module; however, to obtain fog-free high and low frequency information, it is guided by the grayscale module. The reason is that grayscale images are less affected by the multiple colors of atmospheric light commonly present in nighttime scenes. Moreover, having the grayscale module allows us to impose consistency losses between the outputs of the two modules, which is critical to our framework, since we do not have paired ground truths for our real data. Our extensive experiments on real foggy nighttime images show the effectiveness of our method.
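The high/low frequency decomposition at the heart of both modules can be illustrated with a simple low-pass filter. This is only a sketch: a box blur stands in for the paper's (learned) decomposition, and the function names are hypothetical, but it shows the key property the framework relies on, namely that the two layers sum back to the input, so the processed layers can be fused by addition.

```python
def box_blur(img, radius=1):
    """Low-frequency layer: local mean over a (2r+1)x(2r+1) box window.

    `img` is a 2D list of grayscale values; borders are clamped.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy = min(max(y + dy, 0), h - 1)  # clamp to image bounds
                    xx = min(max(x + dx, 0), w - 1)
                    vals.append(img[yy][xx])
            out[y][x] = sum(vals) / len(vals)
    return out


def decompose(img, radius=1):
    """Split a grayscale image into low and high frequency layers.

    The low frequency layer (blur) holds scene layout plus fog/glow;
    the high frequency residual holds texture, which fog affects less.
    """
    low = box_blur(img, radius)
    high = [[img[y][x] - low[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]
    return low, high
```

Because `low + high` reconstructs the input exactly, each layer can be enhanced or defogged independently and the results fused by pixel-wise addition, mirroring the fusion step in the grayscale module.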

Related Material


[pdf]