Self-guiding exploration for combinatorial problems
Iklassov, Zangir ; Du, Yali ; Akimov, Farkhad ; Takac, Martin
Department
Machine Learning
Type
Conference proceeding
Date
2024
Language
English
Abstract
Large Language Models (LLMs) have become pivotal in addressing reasoning tasks across diverse domains, including arithmetic, commonsense, and symbolic reasoning. They utilize prompting techniques such as Exploration-of-Thought, Decomposition, and Refinement to effectively navigate and solve intricate tasks. Despite these advancements, the application of LLMs to Combinatorial Problems (CPs), known for their NP-hardness and critical roles in logistics and resource management, remains underexplored. To address this gap, we introduce a novel prompting strategy: Self-Guiding Exploration (SGE), designed to enhance LLM performance on CPs. SGE operates autonomously, generating multiple thought trajectories for each CP task. It then breaks these trajectories down into actionable subtasks, executes them sequentially, and refines the results to ensure optimal outcomes. We present our research as the first to apply LLMs to a broad range of CPs and demonstrate that SGE outperforms existing prompting strategies by over 27.84% in CP optimization performance. Additionally, SGE achieves 2.46% higher accuracy than the best existing results on other reasoning tasks (arithmetic, commonsense, and symbolic).
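The SGE pipeline described in the abstract (generate trajectories, decompose into subtasks, execute sequentially, refine) can be sketched as a simple prompting loop. This is a minimal illustrative sketch, not the paper's implementation: `query_llm`, the prompt wording, and the default of three trajectories are all assumptions.

```python
# Minimal sketch of the Self-Guiding Exploration (SGE) loop from the abstract.
# `query_llm` is a hypothetical placeholder for a real LLM API call; the prompt
# strings and helper names are illustrative assumptions, not from the paper.

def query_llm(prompt: str) -> str:
    """Placeholder LLM call; swap in a real model API in practice."""
    return f"response to: {prompt[:40]}"

def self_guiding_exploration(task: str, n_trajectories: int = 3) -> str:
    # 1. Autonomously generate several exploratory thought trajectories.
    trajectories = [
        query_llm(f"Propose solution strategy #{i} for: {task}")
        for i in range(1, n_trajectories + 1)
    ]
    # 2. Break each trajectory into actionable subtasks and execute them in order.
    candidates = []
    for strategy in trajectories:
        subtasks = query_llm(f"Break this strategy into subtasks: {strategy}")
        candidates.append(query_llm(f"Execute these subtasks in order: {subtasks}"))
    # 3. Refine the candidate results and return the best one.
    return query_llm(
        "Refine and select the best of these solutions: " + " | ".join(candidates)
    )

best = self_guiding_exploration("Traveling Salesman Problem on 10 cities")
```

With a real model behind `query_llm`, each stage would consume the previous stage's text output, mirroring the trajectory-generation, decomposition, execution, and refinement steps the abstract describes.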
Citation
Z. Iklassov, Y. Du, F. Akimov, and M. Takáč, “Self-Guiding Exploration for Combinatorial Problems,” Adv. Neural Inf. Process. Syst., vol. 37, pp. 130569–130601, Dec. 2024.
Source
Advances in Neural Information Processing Systems (NeurIPS 2024)
Keywords
Self-Guiding Exploration (SGE), Combinatorial problems (CPs), Large Language Models (LLMs), Prompting strategies, Optimization performance
Publisher
NeurIPS
