
Class-partitioned Graph Condensation (CGC)

Introduction

This repository is the implementation of the WWW 2025 paper: Rethinking and Accelerating Graph Condensation: A Training-Free Approach with Class Partition.

CGC replaces the class-to-class matching paradigm of existing graph condensation methods with a novel class-to-node matching paradigm, yielding a highly efficient, training-free condensation process with improved accuracy.
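
To make the idea concrete, the sketch below outlines one way a training-free, class-partitioned condensation step could look: features are propagated SGC-style, nodes are partitioned by class, and per-class cluster centroids serve as condensed nodes. This is only an illustrative sketch under those assumptions (the helper names and the use of scikit-learn's KMeans are ours); the actual CGC implementation, which builds on FAISS, may differ.

# Illustrative sketch of training-free, class-partitioned condensation.
# The helper names and the use of scikit-learn KMeans are assumptions for
# illustration; the actual CGC implementation (built on FAISS) may differ.
import numpy as np
import scipy.sparse as sp
from sklearn.cluster import KMeans

def propagate(adj, feats, hops=2):
    # SGC-style propagation with a symmetrically normalized adjacency matrix.
    adj = adj + sp.eye(adj.shape[0])
    deg = np.asarray(adj.sum(1)).flatten()
    d_inv_sqrt = sp.diags(np.power(deg, -0.5))
    adj_norm = d_inv_sqrt @ adj @ d_inv_sqrt
    for _ in range(hops):
        feats = adj_norm @ feats
    return feats

def condense_by_class(feats, labels, ratio):
    # Partition nodes by class; keep per-class cluster centroids as condensed nodes.
    cond_x, cond_y = [], []
    for c in np.unique(labels):
        class_feats = feats[labels == c]
        n_cond = max(1, int(round(class_feats.shape[0] * ratio)))
        km = KMeans(n_clusters=n_cond, n_init=10).fit(class_feats)
        cond_x.append(km.cluster_centers_)
        cond_y.append(np.full(n_cond, c))
    return np.vstack(cond_x), np.concatenate(cond_y)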


For more work on graph condensation, please refer to our TKDE'25 survey paper 🔥Graph Condensation: A Survey and the paper list Graph Condensation Papers.

Requirements

All experiments are implemented in Python 3.9 with PyTorch 1.12.1.

To install requirements:

pip install -r requirements.txt

Datasets

  • Cora and Citeseer are downloaded automatically from PyG (see the loading sketch after this list).
  • Ogbn-products is downloaded automatically from OGB.
  • For Ogbn-arxiv, Flickr, and Reddit, we use the datasets provided by GraphSAINT. They are available via the Google Drive link (alternatively, the BaiduYun link, code: f1ao); the links are provided by the GraphSAINT team. Download the folder and rename it to ./dataset/ in the root directory.
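
For reference, the automatically downloaded datasets can be fetched with standard PyG calls like the following; the root path ./dataset/ mirrors the folder above, and the exact loading code inside this repository may differ.

from torch_geometric.datasets import Planetoid

# Downloads Cora (or Citeseer) into ./dataset/ on first use.
dataset = Planetoid(root='./dataset', name='Cora')
data = dataset[0]  # one Data object with x, edge_index, y, and train/val/test masks
print(data.num_nodes, dataset.num_classes)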

Condensation and Model Training

To condense the graph using CGC and train GCN models:

$ python main.py --gpu 0 --dataset reddit --ratio 0.001 --generate_adj 1

For the more efficient graphless variant CGC-X:

$ python main.py --gpu 0 --dataset reddit --ratio 0.001 --generate_adj 0

Scripts for all condensation ratios are provided in run.sh.

Results will be saved in ./results/.
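
As a rough sketch of the downstream training step, a condensed graph can be fed to a small GCN in the usual way. The snippet below uses PyG's GCNConv with hypothetical tensors cond_x, cond_edge_index, and cond_y standing in for the condensed graph; it is not the repository's training script.

import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    # A standard two-layer GCN classifier.
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, out_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        h = F.dropout(h, p=0.5, training=self.training)
        return self.conv2(h, edge_index)

# cond_x, cond_edge_index, cond_y: hypothetical condensed features, edges, and labels.
model = GCN(cond_x.size(1), 256, int(cond_y.max()) + 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
model.train()
for epoch in range(200):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(cond_x, cond_edge_index), cond_y)
    loss.backward()
    optimizer.step()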

Acknowledgements

We express our gratitude to GCond and FAISS for their remarkable work and contributions.

Citation

@inproceedings{gao2025rethinking,
 title={Rethinking and Accelerating Graph Condensation: A Training-Free Approach with Class Partition},
 author={Gao, Xinyi and Ye, Guanhua and Chen, Tong and Zhang, Wentao and Yu, Junliang and Yin, Hongzhi},
 booktitle={Proceedings of the ACM on Web Conference 2025},
 year={2025}
}
