Dynamic Expansion Diffusion Learning for Lifelong Generative Modelling

Ye, Fei
Bors, Adrian G.
Zhang, Kun
Department
Machine Learning
Type
Conference proceeding
Date
2025
Language
English
Abstract
Diffusion models have lately been shown to achieve remarkable performance through their ability to generate high-quality images. However, current diffusion model studies consider only learning from a single data distribution, resulting in catastrophic forgetting when attempting to learn new data. In this paper, we explore a more realistic learning scenario where training data is continuously acquired. We propose the Dynamic Expansion Diffusion Model (DEDM) to address catastrophic forgetting and data distribution shifts under the Online Task-Free Continual Learning (OTFCL) paradigm. New diffusion components are added to a mixture model following the evaluation of a criterion that compares the probabilistic representation of the new data with the existing knowledge of the DEDM. In addition, to maintain an optimal architecture, we propose a component discovery approach that ensures the diversity of knowledge while minimizing the total number of parameters in the DEDM. Furthermore, we show how the proposed DEDM can be implemented as a teacher module in a unified framework for representation learning. In this approach, knowledge distillation is used to train a student module that compresses the teacher's knowledge into the student's latent space.
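The expansion mechanism described above can be sketched at a high level: each mixture component summarizes the data it has absorbed, and a new component is spawned only when incoming data falls too far from all existing components. The sketch below is illustrative, not the paper's actual criterion; the class name, the Euclidean distance (standing in for the probabilistic comparison), and the threshold value are all assumptions for demonstration.

```python
import math

class DynamicExpansionMixture:
    """Illustrative sketch of dynamic expansion (assumed mechanics, not the
    paper's method): each component keeps a running mean feature vector, and
    a new component is created when a batch is far from all components."""

    def __init__(self, threshold):
        self.threshold = threshold   # hypothetical expansion threshold
        self.components = []         # list of (mean_vector, batch_count)

    @staticmethod
    def _distance(a, b):
        # Euclidean distance as a stand-in for the paper's probabilistic
        # comparison between new data and stored knowledge.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def observe(self, batch_mean):
        """Route a batch (summarized by its mean feature vector) to the
        nearest component, or expand the mixture if none is close enough."""
        if self.components:
            dists = [self._distance(batch_mean, m) for m, _ in self.components]
            k = min(range(len(dists)), key=dists.__getitem__)
            if dists[k] <= self.threshold:
                # Reuse the nearest component; update its running mean.
                mean, n = self.components[k]
                new_mean = [(m * n + x) / (n + 1)
                            for m, x in zip(mean, batch_mean)]
                self.components[k] = (new_mean, n + 1)
                return k
        # Distribution shift detected: add a new diffusion component.
        self.components.append((list(batch_mean), 1))
        return len(self.components) - 1

# Two well-separated "tasks": batches near the origin, then near (10, 10).
mix = DynamicExpansionMixture(threshold=3.0)
mix.observe([0.0, 0.1])    # first batch -> creates component 0
mix.observe([0.2, 0.0])    # close to component 0 -> reused
mix.observe([10.0, 9.9])   # far from all components -> creates component 1
print(len(mix.components))  # 2 components after the distribution shift
```

In OTFCL there are no task boundaries, so a data-driven test like this, rather than an external task label, decides when the mixture grows; the paper's component discovery step additionally prunes redundant components to keep the parameter count low.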
Citation
F. Ye, A. G. Bors, and K. Zhang, “Dynamic Expansion Diffusion Learning for Lifelong Generative Modelling”, AAAI, vol. 39, no. 21, pp. 22101-22109, Apr. 2025.
Source
Proceedings of the 39th AAAI Conference on Artificial Intelligence
Publisher
Association for the Advancement of Artificial Intelligence