Rethinking Transformer-Based Multi-Document Summarization: An Empirical Investigation
Ma, Congbo ; Zhang, Wei ; Pitawela, Dileepa ; Zhuang, Haojie ; Shu, Yanfeng ; Li, Qing ; Wang, Hu
Department
Computer Vision
Type
Conference proceeding
Date
2026
Language
English
Abstract
Transformer-based models have fueled rapid progress in multi-document summarization (MDS). Given their substantial impact and widespread adoption across natural language processing tasks, investigating their performance and behavior in the context of MDS is crucial for advancing the field and improving summary quality. To thoroughly examine the behavior of Transformer-based MDS models, this paper presents five empirical studies: (1) quantitatively measuring the impact of document boundary separators; (2) exploring the effectiveness of mainstream Transformer structures; (3) examining the sensitivity of the encoder and decoder; (4) comparing different training strategies; and (5) investigating repetition in summary generation. Experimental results on prevalent MDS datasets, evaluated with eleven metrics, show the influence of document boundary separators, the granularity of features at different levels, and different model training strategies. The results also reveal that the decoder is more sensitive to noise than the encoder, underscoring the decoder's important role and suggesting a direction for future research in MDS. Furthermore, the results indicate that the repetition problem in generated summaries correlates with high uncertainty scores.
Citation
C. Ma et al., “Rethinking Transformer-Based Multi-Document Summarization: An Empirical Investigation,” Lecture Notes in Computer Science, vol. 16198, pp. 3–18, 2026, doi: 10.1007/978-981-95-3456-2_1
Source
Lecture Notes in Computer Science
Conference
21st International Conference on Advanced Data Mining and Applications, ADMA 2025
Keywords
Empirical Investigation, Multi-document summarization, Transformer
Publisher
Springer Nature
