
Less but Better: Parameter-Efficient Fine-Tuning of Large Language Models for Personality Detection

Shen, Lingzhi
Long, Yunfei
Cai, Xiaohao
Chen, Guanming
Razzak, Muhammad Imran
Jameel, Shoaib
Department
Computational Biology
Type
Conference proceeding
Date
2025
Language
English
Abstract
Personality detection aims to automatically identify an individual's personality from data sources such as social media text. However, as the parameter scale of language models continues to grow, the computational cost becomes increasingly difficult to manage. Fine-tuning also grows more complex, making it harder to justify the effort and to reliably predict outcomes. We introduce a novel parameter-efficient fine-tuning framework, PersLLM, to address these challenges. In PersLLM, a large language model (LLM) extracts high-dimensional representations from raw data and stores them in a dynamic memory layer. PersLLM then updates only the downstream layers through a replaceable output network, enabling flexible adaptation to various personality detection scenarios. By storing the features in the memory layer, we eliminate the need for repeated, costly forward passes through the LLM. Meanwhile, the lightweight output network serves as a proxy for evaluating the overall effectiveness of the framework, improving the predictability of results. Experimental results on standard benchmark datasets such as Kaggle and Pandora show that PersLLM significantly reduces computational cost while maintaining competitive performance and strong adaptability.
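The pattern the abstract describes (run the expensive frozen LLM once per input, cache the resulting features in a memory layer, and train only a small replaceable output head) can be sketched as follows. This is an illustrative toy, not PersLLM's actual implementation: the encoder, shapes, labels, and learning rate here are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_llm_encode(texts, dim=8):
    """Stand-in for an expensive frozen LLM encoder; called once per text.
    (Hypothetical: real features would come from a pretrained model.)"""
    return rng.standard_normal((len(texts), dim))

# 1) Extract features once and cache them in the "memory layer",
#    so the LLM never has to be re-run during head training.
texts = ["post one", "post two", "post three", "post four"]
memory_layer = dict(zip(texts, frozen_llm_encode(texts)))

# 2) Train only a lightweight, replaceable output head (logistic
#    regression here) on the cached features.
X = np.stack([memory_layer[t] for t in texts])
y = np.array([0, 1, 0, 1])            # toy binary personality labels
W = np.zeros(X.shape[1])
b = 0.0

for _ in range(200):                  # plain gradient descent on the head
    p = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    grad = p - y
    W -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(X @ W + b))) > 0.5).astype(int)
print(preds.tolist())
```

Because only `W` and `b` are updated, swapping in a different output head for a new personality-detection scenario leaves the cached memory layer untouched, which is the source of the computational savings the abstract claims.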
Citation
L. Shen, Y. Long, X. Cai, G. Chen, I. Razzak and S. Jameel, "Less but Better: Parameter-Efficient Fine-Tuning of Large Language Models for Personality Detection," 2025 International Joint Conference on Neural Networks (IJCNN), Rome, Italy, 2025, pp. 1-8, doi: 10.1109/IJCNN64981.2025.11228339.
Source
Proceedings of the International Joint Conference on Neural Networks (IJCNN)
Conference
2025 International Joint Conference on Neural Networks, IJCNN 2025
Keywords
Large Language Models, Parameter-Efficient Fine-Tuning, Personality Detection, Text Classification
Publisher
IEEE