QUIDS: Query Intent Description for Exploratory Search via Dual Space Modeling
Wang, Yumeng; Chen, Xiuying; Verberne, Suzan
Department
Natural Language Processing
Type
Conference proceeding
Date
2025
Language
English
Abstract
In exploratory search, users often submit vague queries to investigate unfamiliar topics, but receive limited feedback about how the search engine understood their input. This leads to a self-reinforcing cycle of mismatched results and trial-and-error reformulation. To address this, we study the task of generating user-facing natural language query intent descriptions that surface what the system likely inferred the query to mean, based on post-retrieval evidence. We propose QUIDS, a method that leverages dual-space contrastive learning to isolate intent-relevant information while suppressing irrelevant content. QUIDS combines a dual-encoder representation space with a disentangling decoder that work together to produce concise and accurate intent descriptions. Enhanced by intent-driven hard negative sampling, the model significantly outperforms state-of-the-art baselines across ROUGE, BERTScore, and human/LLM evaluations. Our qualitative analysis confirms QUIDS’ effectiveness in generating accurate intent descriptions for exploratory search. Our work contributes to improving the interaction between users and search engines by providing feedback to the user in exploratory search settings.
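The contrastive objective with hard negatives described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact formulation: the cosine-similarity encoder space, the InfoNCE-style loss, the temperature value, and the `hard_negatives` selection heuristic (picking the irrelevant candidates most similar to the query) are all illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(query_vec, positive_vec, negative_vecs, temperature=0.1):
    """InfoNCE-style loss: pull the intent-relevant (positive) representation
    toward the query while pushing intent-irrelevant (negative) ones away."""
    sims = [cosine(query_vec, positive_vec)]
    sims += [cosine(query_vec, n) for n in negative_vecs]
    logits = [s / temperature for s in sims]
    # Numerically stable log-sum-exp over positive + negatives.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_sum)

def hard_negatives(query_vec, candidate_vecs, k=2):
    """Intent-driven hard negative sampling (illustrative): among
    irrelevant candidates, keep the k most similar to the query,
    since those are the hardest to push away."""
    ranked = sorted(candidate_vecs,
                    key=lambda c: cosine(query_vec, c),
                    reverse=True)
    return ranked[:k]
```

In this toy setup, the loss is small when the positive vector is close to the query and the negatives are far, and grows when a negative sits closer to the query than the positive does.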
Citation
Y. Wang, X. Chen, and S. Verberne, “QUIDS: Query Intent Description for Exploratory Search via Dual Space Modeling,” Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pp. 33050–33065, 2025, doi: 10.18653/v1/2025.emnlp-main.1680.
Source
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Conference
2025 Conference on Empirical Methods in Natural Language Processing
Keywords
Query Intent Description, Dual-Space Contrastive Learning, Dual-Encoder Representation, Disentangling Decoder, Intent-Driven Hard Negative Sampling, Post-Retrieval Feedback, Search Engine Transparency
Publisher
Association for Computational Linguistics
