Webly Supervised Image Classification with Self-Contained Confidence

Jingkang Yang, Litong Feng, Weirong Chen, Xiaopeng Yan, Huabin Zheng, Ping Luo, Wayne Zhang


This paper focuses on webly supervised learning (WSL), where datasets are built by crawling samples from the Internet and adopting search queries directly as their web labels. Although WSL benefits from fast and low-cost data expansion, noisy web labels prevent models from making reliable predictions. To mitigate this problem, recent works utilize a self-labeled supervised loss $\mathcal{L}_s$ together with the webly supervised loss $\mathcal{L}_w$. $\mathcal{L}_s$ relies on machine labels predicted by the model itself. Since whether the web label or the machine label is correct varies case by case across web samples, it is desirable to adjust the balance between $\mathcal{L}_s$ and $\mathcal{L}_w$ at the sample level. Inspired by DNNs' ability to predict confidence, we introduce self-contained confidence (SCC) by adapting model uncertainty to the WSL setting, and use it to balance $\mathcal{L}_s$ and $\mathcal{L}_w$ sample-wise. A simple yet effective WSL framework is thus proposed. A series of SCC-friendly approaches are investigated, among which our proposed graph-enhanced mixup stands out as the most effective approach, providing high-quality confidence that boosts our framework. The proposed WSL framework achieves state-of-the-art results on two large-scale WSL datasets, WebVision-1000 and Food101-N.
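The core idea of the abstract — a per-sample convex combination of the webly supervised loss $\mathcal{L}_w$ and the self-labeled loss $\mathcal{L}_s$, weighted by a confidence score — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name `scc_weighted_loss`, the use of the model's argmax prediction as the machine label, and the convention that higher SCC favors the web label are all assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with a max-shift for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def scc_weighted_loss(logits, web_labels, scc):
    """Per-sample balance of the webly supervised loss L_w (cross-entropy
    against noisy web labels) and the self-labeled loss L_s (cross-entropy
    against the model's own predicted labels), weighted by a
    self-contained confidence score scc in [0, 1].

    Assumptions (not specified by the abstract): the machine label is
    the argmax of the model's predictive distribution, and high scc
    means high trust in the web label.
    """
    p = softmax(logits)
    n = logits.shape[0]
    machine_labels = p.argmax(axis=1)  # model's own predictions
    l_w = -np.log(p[np.arange(n), web_labels] + 1e-12)
    l_s = -np.log(p[np.arange(n), machine_labels] + 1e-12)
    # High confidence in the web label -> lean on L_w; low -> lean on L_s.
    return scc * l_w + (1.0 - scc) * l_s
```

When the model's prediction agrees with the web label, the two terms coincide and the weighting is irrelevant; the sample-wise balance only matters on samples where the web label and the machine label disagree.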
