Revisiting One-Versus-One and One-Versus-Rest: Insights into Imbalanced Multi-Class Classification
Chen, Kuan-Ting ; Lin, Chih-Jen
Type
Conference proceeding
Abstract
One-versus-one (OVO) and one-versus-rest (OVR) are two widely adopted methods to decompose multi-class problems into several binary classification problems. It is well known that, in the case of kernel SVM, the two methods yield similar test accuracy. Thus, people generally assume that they differ mainly in training time and model size. However, our research reveals that if one considers an evaluation metric taking class imbalance into account, these two methods may give notable performance differences. To explore this phenomenon, we first conduct a detailed analysis of kernel SVM and then extend our study to neural networks. Additionally, we propose novel loss functions for neural networks that effectively integrate the OVO and OVR perspectives. Our experiments clearly demonstrate the robustness of OVO in handling imbalanced multi-class classification, highlighting its advantages over OVR in these challenging scenarios.
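The contrast the abstract describes can be reproduced in a small experiment: decompose the same imbalanced multi-class problem with OVO and with OVR, then score both with plain accuracy and with balanced accuracy (an imbalance-aware metric). The sketch below is illustrative only, not the authors' code; it uses scikit-learn's `OneVsOneClassifier` and `OneVsRestClassifier` wrappers around a kernel SVM on a synthetic skewed dataset.

```python
# Illustrative sketch (not the paper's experiments): compare OVO and OVR
# decompositions of a kernel SVM on a synthetic imbalanced 3-class problem,
# reporting accuracy and balanced accuracy.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, balanced_accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Synthetic 3-class data with a heavily skewed class distribution.
X, y = make_classification(
    n_samples=2000, n_features=20, n_informative=10,
    n_classes=3, weights=[0.85, 0.10, 0.05], random_state=0,
)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.5, stratify=y, random_state=0,
)

results = {}
for name, clf in [
    ("OVO", OneVsOneClassifier(SVC(kernel="rbf", gamma="scale"))),
    ("OVR", OneVsRestClassifier(SVC(kernel="rbf", gamma="scale"))),
]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    results[name] = balanced_accuracy_score(y_te, pred)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.3f}, "
          f"balanced accuracy={results[name]:.3f}")
```

Because the majority class dominates, the two schemes often report similar plain accuracy while their balanced accuracy can diverge, which is the evaluation-metric effect the abstract highlights.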
Citation
K.-T. Chen, C.-J. Lin, "Revisiting One-Versus-One and One-Versus-Rest: Insights into Imbalanced Multi-Class Classification," 2026, pp. 150-158.
Source
2025 IEEE International Conference on Data Mining (ICDM)
Conference
2025 IEEE International Conference on Data Mining (ICDM)
Keywords
46 Information and Computing Sciences, 4611 Machine Learning
Publisher
IEEE
