This article was accepted into the Self-supervised Learning: Theory and Practice workshop at NeurIPS 2023.
*=Equal contribution
Understanding model uncertainty is important for many applications. We propose Bootstrap Your Own Variance (BYOV), combining Bootstrap Your Own Latent (BYOL), a negative-free self-supervised learning (SSL) algorithm, with Bayes by Backprop (BBB), a Bayesian method for estimating model posteriors. We find that the learned predictive standard deviation of BYOV, compared against a supervised BBB model, is well captured by a Gaussian distribution, providing preliminary evidence that the learned parameter posterior is useful for label-free uncertainty estimation. BYOV improves on the deterministic BYOL baseline (+2.83% test ECE, +1.03% test Brier) and exhibits better calibration and reliability when tested with various augmentations (e.g., +2.4% test ECE, +1.2% test Brier for salt-and-pepper noise).
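To make the BBB component concrete, the sketch below shows the core mechanics the abstract relies on: a factorized-Gaussian weight posterior sampled via the reparameterization trick, with a KL penalty to a standard-normal prior. This is an illustrative, minimal implementation (class and function names here are assumptions, not the authors' code), and it omits the BYOL online/target networks that BYOV combines it with.

```python
import numpy as np

def softplus(x):
    # Maps the unconstrained parameter rho to a positive std: sigma = log(1 + e^rho)
    return np.log1p(np.exp(x))

class BayesLinear:
    """Illustrative Bayes-by-Backprop linear layer (a sketch, not the paper's code)."""

    def __init__(self, n_in, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        self.mu = self.rng.normal(0.0, 0.1, (n_in, n_out))  # posterior mean
        self.rho = np.full((n_in, n_out), -3.0)             # softplus(rho) = posterior std

    def sample_weights(self):
        # Reparameterization trick: w = mu + sigma * eps, eps ~ N(0, I)
        sigma = softplus(self.rho)
        eps = self.rng.standard_normal(self.mu.shape)
        return self.mu + sigma * eps

    def forward(self, x):
        # Each forward pass draws fresh weights, so repeated calls give
        # different outputs; their spread is the predictive uncertainty.
        return x @ self.sample_weights()

    def kl_to_standard_normal(self):
        # KL(q(w) || N(0, I)) for a factorized Gaussian posterior,
        # added to the task loss during training (the BBB objective).
        sigma2 = softplus(self.rho) ** 2
        return 0.5 * np.sum(sigma2 + self.mu ** 2 - 1.0 - np.log(sigma2))

layer = BayesLinear(4, 2)
x = np.ones((3, 4))
y1, y2 = layer.forward(x), layer.forward(x)  # two stochastic forward passes
print(y1.shape, layer.kl_to_standard_normal() > 0, np.allclose(y1, y2))
```

In BYOV, this stochasticity is what yields a predictive standard deviation without labels: multiple sampled forward passes through the posterior give an output distribution whose spread can be compared against that of a supervised BBB model.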