Master of All: Simultaneous Generalization of Urban-Scene Segmentation to All Adverse Weather Conditions

Nikhil Reddy, Abhinav Singhal, Abhishek Kumar, Mahsa Baktashmotlagh, Chetan Arora

Abstract


"Computer vision systems for autonomous navigation must generalize well in adverse weather and illumination conditions expected in the real world. However, semantic segmentation of images captured in such conditions remains a challenging task for current state-of-the-art (\sota) methods trained on broad daylight images, due to the associated distribution shift. On the other hand, domain adaptation techniques developed for the purpose rely on the availability of the source data, (un)labeled target data and/or its auxiliary information (e.g., \gps). Even then, they typically adapt to a single(specific) target domain(s). To remedy this, we propose a novel, fully test time, adaptation technique, named \textit{Master of ALL} (\mall), for simultaneous generalization to multiple target domains. \mall learns to generalize on unseen adverse weather images from multiple target domains directly at the inference time. More specifically, given a pre-trained model and its parameters, \mall enforces edge consistency prior at the inference stage and updates the model based on (a) a single test sample at a time (\malls), or (b) continuously for the whole test domain (\malld). Not only the target data, \mall also does not need access to the source data and thus, can be used with any pre-trained model. Using a simple model pre-trained on daylight images, \mall outperforms specially designed adverse weather semantic segmentation methods, both in domain generalization and test-time adaptation settings. Our experiments on foggy, snow, night, cloudy, overcast, and rainy conditions demonstrate the target domain-agnostic effectiveness of our approach. We further show that \mall can improve the performance of a model on an adverse weather condition, even when the model is already pre-trained for the specific condition."
