A Non-Isotropic Probabilistic Take On Proxy-Based Deep Metric Learning
Michael Kirchhof, Karsten Roth, Zeynep Akata, Enkelejda Kasneci
"Proxy-based Deep Metric Learning (DML) learns deep metric spaces by embedding images and class representatives (proxies) close to one another during training, as commonly measured by the angle between them. However, this disregards the embedding norm, which can carry additional beneficial context such as class- or image-intrinsic uncertainty. In addition, proxy-based DML struggles to learn class-internal structures. To address both issues at once, we introduce non-isotropic probabilistic proxy-based DML. We model images as directional von Mises-Fisher (vMF) distributions on the hypersphere and motivate a non-isotropic von Mises-Fisher (nivMF) model for class proxies. This allows proxies to better represent complex class-specific variances in the embedding space. To cast these probabilistic models into proxy-to-image metrics, we further develop and investigate multiple distribution-to-point and distribution-to-distribution metrics. Each framework choice is motivated through a set of ablation studies, which showcase beneficial properties of our probabilistic approach to proxy-based DML. These comprise uncertainty-awareness, better-behaved gradients during training, and overall improved generalization performance. The latter is especially reflected in competitive performance on the standard DML benchmarks. We find our proposed approach to compare favourably, suggesting that existing proxy-based DML can significantly benefit from a more probabilistic treatment."