
Information-theoretic complementary prompts for improved continual text classification

Zhang, Duzhen
Ren, Yong
Li, Chenxing
Yu, Dong
Zhang, Tielin
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
Continual Text Classification (CTC) aims to continuously classify new text data over time while minimizing catastrophic forgetting of previously acquired knowledge. However, existing methods often focus on task-specific knowledge, overlooking the importance of shared, task-agnostic knowledge. Inspired by the complementary learning systems theory, which posits that humans learn continually through the interaction of two systems — the hippocampus, responsible for forming distinct representations of specific experiences, and the neocortex, which extracts more general and transferable representations from past experiences — we introduce Information-Theoretic Complementary Prompts (InfoComp), a novel approach for CTC. InfoComp explicitly learns two distinct prompt spaces: P(rivate)-Prompt and S(hared)-Prompt. These respectively encode task-specific and task-invariant knowledge, enabling models to sequentially learn classification tasks without relying on data replay. To promote more informative prompt learning, InfoComp uses an information-theoretic framework that maximizes mutual information between different parameters (or encoded representations). Within this framework, we design two novel loss functions: (1) to strengthen the accumulation of task-specific knowledge in P-Prompt, effectively mitigating catastrophic forgetting, and (2) to enhance the retention of task-invariant knowledge in S-Prompt, improving forward knowledge transfer. Extensive experiments on diverse CTC benchmarks show that our approach outperforms previous state-of-the-art methods.
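The abstract describes the architecture only at a high level. As a rough illustration of the dual-prompt design it outlines, the PyTorch sketch below keeps one private prompt per task (P-Prompt) alongside a single shared prompt (S-Prompt), prepends both to the token embeddings fed to a frozen encoder, and uses an InfoNCE estimator as a generic stand-in for the paper's two mutual-information objectives. All names, shapes, and the choice of MI estimator are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ComplementaryPrompts(nn.Module):
    """Hypothetical dual-prompt module: per-task P-Prompts plus one shared S-Prompt."""
    def __init__(self, num_tasks: int, prompt_len: int, dim: int):
        super().__init__()
        # P-Prompt: one private prompt per task (task-specific knowledge).
        self.p_prompts = nn.Parameter(torch.randn(num_tasks, prompt_len, dim) * 0.02)
        # S-Prompt: a single prompt shared across tasks (task-invariant knowledge).
        self.s_prompt = nn.Parameter(torch.randn(prompt_len, dim) * 0.02)

    def forward(self, token_embeds: torch.Tensor, task_id: int) -> torch.Tensor:
        # Prepend [S-Prompt; P-Prompt_t] to the frozen encoder's input embeddings.
        b = token_embeds.size(0)
        s = self.s_prompt.unsqueeze(0).expand(b, -1, -1)
        p = self.p_prompts[task_id].unsqueeze(0).expand(b, -1, -1)
        return torch.cat([s, p, token_embeds], dim=1)

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    # InfoNCE lower bound on mutual information between paired representations;
    # a standard estimator, used here only as a placeholder for the paper's losses.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                            # (B, B) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage: prompts for task 0 prepended to a batch of token embeddings.
prompts = ComplementaryPrompts(num_tasks=5, prompt_len=8, dim=768)
x = torch.randn(4, 16, 768)            # (batch, seq_len, dim) stand-in embeddings
inputs = prompts(x, task_id=0)         # (4, 8 + 8 + 16, 768)

Keeping the two prompt spaces separate mirrors the complementary-learning-systems framing in the abstract: each P-Prompt stores knowledge specific to one task, while the S-Prompt accumulates transferable knowledge across tasks.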
Citation
D. Zhang, Y. Ren, C. Li, D. Yu, and T. Zhang, “Information-theoretic complementary prompts for improved continual text classification,” Neural Networks, vol. 190, p. 107676, Oct. 2025, doi: 10.1016/j.neunet.2025.107676.
Source
Neural Networks
Keywords
Continual learning, Text classification, Complementary learning systems, Prompt tuning, Information-theoretic framework
Publisher
Elsevier