IMCGNN: Information Maximization based Continual Graph Neural Networks for inductive node classification

Research output: Contribution to journal › Article › peer-review

Abstract

Continual graph learning (CGL) is an emerging topic that enables models to incrementally acquire new knowledge while retaining prior experience. It efficiently adapts models to evolving dynamic graphs, avoiding the computational burden of retraining from scratch. The key distinction between CGL and conventional continual learning is the interdependence of samples in graph-structured data, as opposed to the independence of samples in conventional learning. Consequently, CGL techniques should emphasize consolidating and leveraging the topological information in graph-structured data. Current methods address this need inadequately: some ignore topological information, incurring significant information loss, while others attempt to preserve all learned information, yielding overly conservative models. Moreover, most of these methods employ graph neural networks (GNNs) as the base model yet fail to fully exploit the topological information the GNNs learn. Additionally, the majority of existing work focuses on the transductive setting, leaving inductive continual graph learning largely unexplored. Our proposed Information Maximization based Continual Graph Neural Network (IMCGNN) targets inductive task-incremental node classification. It comprises a replay module and a regularization module: the former extracts representative subgraphs from previous data and trains them jointly with new data to retain historical experience, whereas the latter preserves topological and loss-related information by imposing elastic penalties on network parameters. Unlike heuristic node selection, our approach uses information theory to guide the selection of nodes forming each replayed subgraph, aiming to preserve information better. Comparative experiments against nine baselines, using two graph learning models on five benchmark datasets, demonstrate the effectiveness and efficiency of our method.
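The abstract names the two modules but does not give their exact forms. As an illustration only, the sketch below uses generic stand-ins: an EWC-style quadratic ("elastic") penalty for the regularization module, and predictive-entropy scoring as a simple information-theoretic proxy for selecting replay nodes. All function names and formulas here are assumptions, not the paper's actual method.

```python
import numpy as np

def elastic_penalty(params, anchor_params, importance, lam=1.0):
    # EWC-style quadratic penalty (assumed form, not the paper's exact one):
    # parameters deemed important for previous tasks are pulled back toward
    # the values they held after training on those tasks.
    penalty = 0.0
    for name, theta in params.items():
        diff = theta - anchor_params[name]
        penalty += np.sum(importance[name] * diff ** 2)
    return 0.5 * lam * penalty

def select_replay_nodes(probs, k):
    # Score each node by the Shannon entropy of its predictive distribution,
    # a simple proxy for information-based node selection, and keep the
    # k highest-scoring nodes for the replayed subgraph.
    eps = 1e-12
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    return np.argsort(entropy)[::-1][:k]
```

In this reading, the selected nodes (plus their induced edges) would form the replayed subgraph, while the elastic penalty is added to the task loss during training on new data.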
Original language: English
Article number: 129362
Number of pages: 11
Journal: Neurocomputing
Volume: 624
Publication status: Published - Apr 2025

Keywords

  • Continual graph learning
  • Experience replay
  • Deep learning
