Probability-Guided Contrastive Learning for Long-Tailed Domain Generalization
Wang, Mengzhu ; Su, Houcheng ; Wang, Shanshan ; Yin, Nan ; Lan, Long ; Yang, Liang ; Shen, Li
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
Domain generalization (DG) techniques aim to train models on source domains so that they perform well on new, unseen target domains. Existing DG methods often employ contrastive learning to learn domain-invariant features: effective representations in which samples from the same category cluster together in feature space while samples from different categories are dispersed. However, traditional contrastive learning is limited to a finite set of contrastive pairs. To address this limitation, we sample from an infinite number of contrastive pairs by modeling the features of each category with a mixture of von Mises-Fisher (vMF) distributions on the unit hypersphere. We propose a novel method, Probability-guided Contrastive Learning (PgCL), which selects contrastive pairs according to the estimated feature distribution of each category, and we derive an exact analytical formula for the expected contrastive loss. We empirically investigate the error bounds of PgCL and demonstrate its performance against several leading methods on a range of DG datasets.
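The abstract's core idea rests on drawing features from vMF distributions on the unit hypersphere. The paper's exact PgCL sampler and loss are not given on this page; as a hedged illustration only, the sketch below implements a generic vMF sampler (Wood's 1994 rejection method) and uses two components with opposite mean directions as stand-ins for a "positive" and a "negative" class cluster. All names here (`sample_vmf`, `mu`, `kappa`) are illustrative, not from the paper.

```python
import numpy as np

def sample_vmf(mu, kappa, n, rng):
    """Draw n samples from a von Mises-Fisher distribution on the unit
    hypersphere with mean direction mu and concentration kappa, using
    Wood's (1994) rejection sampler. Not the paper's method; a generic
    vMF sampler for illustration."""
    mu = np.asarray(mu, dtype=float)
    d = mu.size
    mu = mu / np.linalg.norm(mu)
    b = (-2 * kappa + np.sqrt(4 * kappa**2 + (d - 1) ** 2)) / (d - 1)
    x0 = (1 - b) / (1 + b)
    c = kappa * x0 + (d - 1) * np.log(1 - x0**2)
    samples = np.empty((n, d))
    for i in range(n):
        # Rejection step: sample the cosine w = <x, mu> of the angle to mu.
        while True:
            z = rng.beta((d - 1) / 2, (d - 1) / 2)
            w = (1 - (1 + b) * z) / (1 - (1 - b) * z)
            u = rng.uniform()
            if kappa * w + (d - 1) * np.log(1 - x0 * w) - c >= np.log(u):
                break
        # Uniform direction in the tangent space orthogonal to mu.
        v = rng.standard_normal(d)
        v -= v.dot(mu) * mu
        v /= np.linalg.norm(v)
        samples[i] = w * mu + np.sqrt(1 - w**2) * v
    return samples

rng = np.random.default_rng(0)
mu = np.array([1.0, 0.0, 0.0, 0.0])
# Two tight vMF clusters standing in for same-class vs. different-class features.
pos = sample_vmf(mu, kappa=50.0, n=200, rng=rng)
neg = sample_vmf(-mu, kappa=50.0, n=200, rng=rng)
```

With a large concentration `kappa`, each cluster stays close to its mean direction, so same-class samples concentrate while the two classes remain well separated on the sphere, which is the geometric picture the abstract's contrastive objective relies on.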
Citation
M. Wang et al., "Probability-Guided Contrastive Learning for Long-Tailed Domain Generalization," in IEEE Transactions on Big Data, doi: 10.1109/TBDATA.2025.3624965
Source
IEEE Transactions on Big Data
Keywords
Contrastive Pairs, Domain Generalization, Probability-Guided Contrastive Learning, von Mises-Fisher Distributions
Publisher
IEEE
