
Graphormer

By Chengxuan Ying, Tianle Cai, Shengjie Luo, Shuxin Zheng*, Guolin Ke, Di He*, Yanming Shen and Tie-Yan Liu.

This repo is the official implementation of "Do Transformers Really Perform Bad for Graph Representation?".

News

08/03/2021

  1. Code and scripts are released.

06/16/2021

  1. Graphormer won 1st place in the quantum prediction track of the Open Graph Benchmark Large-Scale Challenge (KDD CUP 2021). [Competition Description] [Competition Result] [Technical Report] [Blog (English)] [Blog (Chinese)]

Introduction

Graphormer was initially described in the arXiv preprint. It is a standard Transformer architecture equipped with several structural encodings that effectively encode the structural information of a graph into the model.

Graphormer achieves strong performance on PCQM4M-LSC (0.1234 MAE on val), MolPCBA (31.39% AP on test), MolHIV (80.51% AUC on test), and ZINC (0.122 MAE on test), surpassing previous models by a large margin.
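
The two encodings at the heart of the model are easiest to see in code. Below is a minimal, self-contained sketch of a Graphormer-style attention layer as described in the paper: a centrality encoding adds learnable degree embeddings to the node features, and a spatial encoding adds a learnable per-head bias, indexed by shortest-path distance, to the attention logits. All class and parameter names here are illustrative; this is not the code in this repository.

```python
import torch
import torch.nn as nn


class GraphormerAttentionSketch(nn.Module):
    """Single attention layer with Graphormer-style structural encodings (illustrative)."""

    def __init__(self, hidden_dim=80, num_heads=8, max_degree=64, max_dist=20):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_dim // num_heads
        # Centrality encoding: a learnable vector per node degree.
        self.degree_emb = nn.Embedding(max_degree, hidden_dim)
        # Spatial encoding: a learnable scalar per head per shortest-path distance.
        self.spatial_bias = nn.Embedding(max_dist, num_heads)
        self.qkv = nn.Linear(hidden_dim, 3 * hidden_dim)
        self.out = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, x, degree, spd):
        # x:      (batch, nodes, hidden_dim) node features
        # degree: (batch, nodes) node degrees, clipped to max_degree - 1
        # spd:    (batch, nodes, nodes) shortest-path distances, clipped to max_dist - 1
        x = x + self.degree_emb(degree)  # centrality encoding
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        shape = (b, n, self.num_heads, self.head_dim)
        q, k, v = (t.view(shape).transpose(1, 2) for t in (q, k, v))
        logits = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        # Spatial encoding: bias the attention logits by shortest-path distance.
        logits = logits + self.spatial_bias(spd).permute(0, 3, 1, 2)
        attn = logits.softmax(dim=-1)
        return self.out((attn @ v).transpose(1, 2).reshape(b, n, -1))


if __name__ == "__main__":
    layer = GraphormerAttentionSketch()
    x = torch.randn(2, 10, 80)              # 2 graphs, 10 nodes each
    degree = torch.randint(0, 64, (2, 10))  # toy degrees
    spd = torch.randint(0, 20, (2, 10, 10)) # toy shortest-path distances
    print(layer(x, degree, spd).shape)      # torch.Size([2, 10, 80])
```

Because the bias depends only on graph structure, every node can attend to every other node while still receiving a learned signal about how far apart they are.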

Main Results

PCQM4M-LSC

| Method | #params | train MAE | valid MAE |
| --- | --- | --- | --- |
| GCN | 2.0M | 0.1318 | 0.1691 |
| GIN | 3.8M | 0.1203 | 0.1537 |
| GCN-VN | 4.9M | 0.1225 | 0.1485 |
| GIN-VN | 6.7M | 0.1150 | 0.1395 |
| Graphormer-Small | 12.5M | 0.0778 | 0.1264 |
| Graphormer | 47.1M | 0.0582 | 0.1234 |

OGBG-MolPCBA

| Method | #params | test AP (%) |
| --- | --- | --- |
| DeeperGCN-VN+FLAG | 5.6M | 28.42 |
| DGN | 6.7M | 28.85 |
| GINE-VN | 6.1M | 29.17 |
| PHC-GNN | 1.7M | 29.47 |
| GINE-APPNP | 6.1M | 29.79 |
| Graphormer | 119.5M | 31.39 |

OGBG-MolHIV

| Method | #params | test AUC (%) |
| --- | --- | --- |
| GCN-GraphNorm | 526K | 78.83 |
| PNA | 326K | 79.05 |
| PHC-GNN | 111K | 79.34 |
| DeeperGCN-FLAG | 532K | 79.42 |
| DGN | 114K | 79.70 |
| Graphormer | 47.0M | 80.51 |

ZINC-500K

| Method | #params | test MAE |
| --- | --- | --- |
| GIN | 509.5K | 0.526 |
| GraphSage | 505.3K | 0.398 |
| GAT | 531.3K | 0.384 |
| GCN | 505.1K | 0.367 |
| GT | 588.9K | 0.226 |
| GatedGCN-PE | 505.0K | 0.214 |
| MPNN (sum) | 480.8K | 0.145 |
| PNA | 387.2K | 0.142 |
| SAN | 508.6K | 0.139 |
| Graphormer-Slim | 489.3K | 0.122 |

Requirements and Installation

Setup with Conda

```bash
# create a new environment
conda create --name graphormer python=3.7
conda activate graphormer

# install requirements
pip install rdkit-pypi cython
pip install torch==1.7.1+cu110 torchvision==0.8.2+cu110 -f https://download.pytorch.org/whl/torch_stable.html
pip install torch-geometric==1.6.3 ogb==1.3.1 pytorch-lightning==1.3.1 tqdm torch-sparse==0.6.9 torch-scatter==2.0.6 -f https://pytorch-geometric.com/whl/torch-1.7.0+cu110.html
```
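
After installation, a quick way to confirm that the pinned versions resolved correctly is to import each package and print its version. This is a generic sanity check, not a script from this repository:

```python
# Post-install sanity check: confirm the pinned packages import
# and report the versions requested above.
import torch
import torch_geometric
import pytorch_lightning
import ogb

print("torch:", torch.__version__)                          # expect 1.7.1+cu110
print("CUDA available:", torch.cuda.is_available())
print("torch-geometric:", torch_geometric.__version__)      # expect 1.6.3
print("pytorch-lightning:", pytorch_lightning.__version__)  # expect 1.3.1
print("ogb:", ogb.__version__)                              # expect 1.3.1
```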

Citation

Please cite this paper if you use the code:

@article{ying2021transformers,
  title={Do Transformers Really Perform Bad for Graph Representation?},
  author={Ying, Chengxuan and Cai, Tianle and Luo, Shengjie and Zheng, Shuxin and Ke, Guolin and He, Di and Shen, Yanming and Liu, Tie-Yan},
  journal={arXiv preprint arXiv:2106.05234},
  year={2021}
}

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
