Title: HiGraph-LLM: Hierarchical Graph Encoding and Integration with Large Language Models

Authors: Zhen Cai, Yanhua Yu, Xidian Wang, Kangkang Lu, Tu Ao, Mingliang Yan, Liang Pang, Pinghui Wang and Tat-Seng Chua

Conference: PRICAI 2025

Tags: Graph Neural Networks, Graph Structure Learning, Large Language Models

Abstract: Graph Neural Networks (GNNs) have achieved remarkable performance on graph-centric tasks such as node classification and link prediction, while Large Language Models (LLMs) have shown impressive language understanding across diverse domains. GNNs effectively capture structural information but struggle with rich semantic modeling; LLMs offer strong contextual reasoning yet fail to encode graph topology. Bridging this gap requires addressing both the inherent limitations of node representation learning and the difficulty of aligning graph-structured data with the token space of LLMs. To address these challenges, we introduce HiGraph-LLM, a novel framework for hierarchical graph encoding and integration with large language models. HiGraph-LLM refines node representations by integrating multi-level structural features and aligns them with LLMs through curriculum-driven prompt learning. Specifically, HiGraph-LLM consists of two modules: the Hierarchical Node Information Learning Module, which consolidates information across hierarchical node levels to improve node representations, and the LLM's Graph Information Integration Module, which optimizes the alignment of graph data with the LLM. Comprehensive experiments on multiple benchmark datasets demonstrate the effectiveness of the proposed method. The code will be released upon acceptance of the paper.
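Since the paper's code is not yet released, the sketch below is only an illustration of how the two modules described in the abstract might be realized in PyTorch. Every name in it (HierarchicalNodeEncoder, GraphTokenProjector, the hop-wise aggregation, the two-layer projection) is a hypothetical assumption rather than the authors' implementation: the first module mixes node features aggregated over several hop levels, and the second projects the resulting representations into the LLM's token-embedding space so they can act as soft prompt tokens.

```python
# Hypothetical sketch of the two-module pipeline described in the abstract.
# Not the authors' code: it only illustrates (1) consolidating multi-level
# (hop-wise) structural features into node representations and (2) mapping
# those representations into the LLM's token-embedding space.
import torch
import torch.nn as nn


class HierarchicalNodeEncoder(nn.Module):
    """Stand-in for the Hierarchical Node Information Learning Module:
    combines node features aggregated over several hop levels."""

    def __init__(self, in_dim: int, hidden_dim: int, num_levels: int = 3):
        super().__init__()
        self.num_levels = num_levels
        self.level_proj = nn.ModuleList(
            nn.Linear(in_dim, hidden_dim) for _ in range(num_levels)
        )
        # Learned weights for mixing the per-level representations.
        self.level_weights = nn.Parameter(torch.ones(num_levels))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim); adj: row-normalized (num_nodes, num_nodes).
        levels, h = [], x
        for k in range(self.num_levels):
            levels.append(self.level_proj[k](h))  # encode the k-hop features
            h = adj @ h                           # propagate to the next hop
        weights = torch.softmax(self.level_weights, dim=0)
        return sum(w * z for w, z in zip(weights, levels))


class GraphTokenProjector(nn.Module):
    """Stand-in for the LLM's Graph Information Integration Module:
    maps node representations into the LLM token-embedding space."""

    def __init__(self, hidden_dim: int, llm_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(hidden_dim, llm_dim),
            nn.GELU(),
            nn.Linear(llm_dim, llm_dim),
        )

    def forward(self, node_repr: torch.Tensor) -> torch.Tensor:
        # Outputs can be prepended to text-token embeddings as soft prompts.
        return self.proj(node_repr)


# Toy usage: 5 nodes, 16-dim features, a 768-dim LLM embedding space.
x = torch.randn(5, 16)
adj = torch.softmax(torch.randn(5, 5), dim=-1)  # stand-in normalized adjacency
encoder = HierarchicalNodeEncoder(16, 64)
projector = GraphTokenProjector(64, 768)
graph_tokens = projector(encoder(x, adj))  # (5, 768) soft-prompt embeddings
```

In a full pipeline the projected graph tokens would be concatenated with the prompt's text-token embeddings before the LLM forward pass; the curriculum-driven prompt learning mentioned in the abstract (e.g., ordering training prompts from easy to hard) is not shown here.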
