Convergence of the EM algorithm in KL distance for overspecified Gaussian mixtures
Legg, Alan ; Pak, Artur ; Melnykov, Igor ; Bolatov, Arman ; Assylbekov, Zhenisbek
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
We study the convergence properties of the Expectation-Maximization (EM) algorithm when applied to an overspecified model. In particular, we consider fitting a balanced mixture of two Gaussians to data originating from a single Gaussian. We provide theoretical bounds on the Kullback-Leibler (KL) divergence between the fitted and true distributions. A key feature is the concavity and radiality of the expected log-likelihood function on a hypersurface induced by the EM algorithm, which greatly simplify the analysis. We also show how our result on KL divergence can be used to upper-bound the error rate of a mixture discriminant analysis classifier trained by the EM algorithm.
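The setting of the abstract can be sketched numerically: run EM for a balanced mixture of two unit-variance Gaussians on data drawn from a single standard Gaussian, then estimate the KL divergence between the true density and the fitted mixture by Monte Carlo. This is only an illustrative sketch of the overspecified setup, not the paper's proof technique; the sample size, initialization, and iteration count below are arbitrary choices.

```python
import numpy as np

def em_balanced_mixture(x, mu_init, n_iter=200):
    """EM for a balanced (weights 1/2) mixture of two unit-variance
    Gaussians N(mu1, 1) and N(mu2, 1), updating only the means."""
    mu1, mu2 = mu_init
    for _ in range(n_iter):
        # E-step: posterior responsibility of component 1 for each point;
        # the normalizing constants of the two Gaussians cancel.
        log_ratio = 0.5 * ((x - mu2) ** 2 - (x - mu1) ** 2)
        r1 = 1.0 / (1.0 + np.exp(-log_ratio))
        # M-step: responsibility-weighted mean updates
        mu1 = np.sum(r1 * x) / np.sum(r1)
        mu2 = np.sum((1.0 - r1) * x) / np.sum(1.0 - r1)
    return mu1, mu2

def kl_true_vs_mixture(mu1, mu2, n_mc=100_000, seed=0):
    """Monte Carlo estimate of KL(N(0,1) || 0.5 N(mu1,1) + 0.5 N(mu2,1)).
    The shared Gaussian constant -0.5*log(2*pi) cancels in the log-ratio."""
    z = np.random.default_rng(seed).standard_normal(n_mc)
    log_p = -0.5 * z ** 2
    log_q = np.logaddexp(-0.5 * (z - mu1) ** 2,
                         -0.5 * (z - mu2) ** 2) - np.log(2.0)
    return float(np.mean(log_p - log_q))

rng = np.random.default_rng(42)
x = rng.standard_normal(5000)  # true model: a single N(0, 1)
mu1, mu2 = em_balanced_mixture(x, mu_init=(-1.0, 1.0))
kl = kl_true_vs_mixture(mu1, mu2)
```

Consistent with the paper's theme, the two fitted means drift toward each other and the estimated KL divergence between the single Gaussian and the overspecified mixture becomes small, even though the component means themselves converge slowly.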
Citation
A. Legg, A. Pak, I. Melnykov, A. Bolatov, and Z. Assylbekov, “Convergence of the EM algorithm in KL distance for overspecified Gaussian mixtures,” Statistical Papers, vol. 66, no. 6, pp. 1–33, Oct. 2025, doi: 10.1007/s00362-025-01749-z.
Source
Statistical Papers
Publisher
Springer Nature
