Refining Self-Supervised Learning in Imaging: Beyond Linear Metric [article]

Bo Jiang, Hamid Krim, Tianfu Wu, Derya Cansever
2022, arXiv preprint
In this paper we introduce a new statistical perspective that exploits the Jaccard similarity as a measure-based metric to effectively invoke non-linear features in the loss of self-supervised contrastive learning. Specifically, the proposed metric may be interpreted as a dependence measure between two adapted projections learned from the so-called latent representations. This is in contrast to the cosine similarity measure in the conventional contrastive learning model, which accounts only for correlation information. To the best of our knowledge, the non-linearly fused information embedded in the Jaccard similarity is novel to self-supervised learning, with promising results. The proposed approach is compared to two state-of-the-art self-supervised contrastive learning methods on three image datasets. We demonstrate not only its applicability to current ML problems, but also its improved performance and training efficiency.
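The abstract does not spell out the paper's exact formulation, but the contrast it draws can be illustrated with a generic weighted (soft) Jaccard similarity for non-negative feature vectors, defined as the sum of element-wise minima over the sum of element-wise maxima, side by side with the cosine similarity used in conventional contrastive losses. This is a minimal sketch under that assumption, not the authors' method; the function names are illustrative only.

```python
import math

def soft_jaccard(a, b):
    """Weighted (soft) Jaccard similarity for non-negative vectors:
    sum(min(a_i, b_i)) / sum(max(a_i, b_i)). Sensitive to both the
    direction and the magnitude of the two vectors."""
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

def cosine(a, b):
    """Cosine similarity: normalized dot product, invariant to scale."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# A pair of proportional vectors: cosine treats them as identical,
# while the soft Jaccard similarity penalizes the magnitude mismatch.
a, b = [1.0, 2.0], [2.0, 4.0]
print(cosine(a, b))        # 1.0
print(soft_jaccard(a, b))  # 0.5
```

The toy example shows why such a measure is more than a linear correlation score: two collinear projections with different magnitudes are indistinguishable under cosine similarity but not under the Jaccard-style measure.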
arXiv:2202.12921v2