Title: LLM-Distilled Surrogate Model for Expensive Multi-Objective Optimization

Conference: PRICAI 2025

Tags: Evolutionary Algorithms, Knowledge Distillation and Synthetic Data

Abstract: In expensive multi-objective optimization problems (EMOPs), surrogate-assisted evolutionary algorithms (SAEAs) are among the most widely used solutions. However, surrogate models often suffer from degraded performance due to limited training data, a prevalent and critical challenge in this domain. To address this issue, we propose a novel framework named DuSiM that leverages the capabilities of Large Language Models (LLMs) to assist surrogate model training. DuSiM uses LLMs to generate additional high-quality training data, which improves the surrogate model's approximation accuracy despite the scarcity of evaluated training data. Specifically, DuSiM first uses the surrogate model to guide the prompt-feedback tuning of the LLM. Once the LLM adapts to predicting evaluation function values and uncertainties, it generates a substantial amount of high-quality synthetic data to assist in training the surrogate model. To evaluate the effectiveness of DuSiM, we compare it with five state-of-the-art algorithms on a variety of problems. Experimental results demonstrate that our framework accelerates the convergence of SAEAs and outperforms the other algorithms in most cases.
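Since the page only provides the abstract, the following is a minimal Python sketch of the two-step loop it describes: surrogate-guided prompt-feedback tuning of the LLM, followed by LLM-generated synthetic data for retraining the surrogate. Every concrete detail here (the Gaussian-process surrogate, the `query_llm` helper, the prompt format, the uncertainty-based filtering rule) is an illustrative assumption, not the paper's actual implementation.

```python
# Illustrative sketch of a DuSiM-style loop, based only on the abstract.
# All names and design choices below are assumptions for exposition.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def query_llm(prompt: str) -> tuple[float, float]:
    """Hypothetical stand-in for an LLM call that returns a predicted
    objective value and an uncertainty estimate for one candidate."""
    ...  # e.g., parse (mean, std) from the LLM's text response
    return 0.0, 1.0


def dusim_step(X_eval, y_eval, X_candidates, n_synthetic=64, rounds=3):
    # One objective is shown for brevity; an EMOP would typically fit
    # one surrogate per objective.
    # 1) Fit the surrogate on the scarce, truly evaluated data.
    surrogate = GaussianProcessRegressor().fit(X_eval, y_eval)

    # 2) Prompt-feedback tuning: show the LLM candidates together with
    #    the surrogate's predictions so it can align its own estimates
    #    of objective values and uncertainties (simplified here).
    for _ in range(rounds):
        for x in X_candidates[:8]:
            mu, std = surrogate.predict(x.reshape(1, -1), return_std=True)
            prompt = (f"Input: {x.tolist()}\n"
                      f"Surrogate predicts mean={mu[0]:.4f}, std={std[0]:.4f}.\n"
                      "Predict the objective value and your uncertainty.")
            llm_mu, llm_std = query_llm(prompt)

    # 3) Once the LLM tracks the objective, distill a larger synthetic set.
    X_syn = np.random.uniform(X_eval.min(0), X_eval.max(0),
                              size=(n_synthetic, X_eval.shape[1]))
    preds = [query_llm(f"Predict objective for input {x.tolist()}")
             for x in X_syn]
    y_syn = np.array([mu for mu, _ in preds])
    conf = np.array([std for _, std in preds])
    keep = conf < np.median(conf)  # keep only low-uncertainty samples

    # 4) Retrain the surrogate on evaluated + filtered synthetic data.
    X_aug = np.vstack([X_eval, X_syn[keep]])
    y_aug = np.concatenate([y_eval, y_syn[keep]])
    return GaussianProcessRegressor().fit(X_aug, y_aug)
```

The filtering step reflects the abstract's emphasis on "high-quality" synthetic data: one plausible reading is that samples where the LLM reports low predictive uncertainty are more trustworthy, though the paper's actual selection criterion is not stated on this page.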
