Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
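The core idea of that architecture is multi-head attention restricted to each node's graph neighborhood, combined with Laplacian eigenvector positional encodings. Below is a minimal, illustrative sketch of that idea in plain PyTorch; it is not the official implementation, and the names (`GraphTransformerLayer`, `lap_pe`) are made up for this example.

```python
# Illustrative sketch only: neighborhood-restricted attention + Laplacian PE,
# in the spirit of "A Generalization of Transformer Networks to Graphs".
import torch
import torch.nn as nn

def lap_pe(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Laplacian positional encodings: the k smallest non-trivial eigenvectors."""
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1).pow(-0.5))
    lap = torch.eye(adj.size(0)) - d_inv_sqrt @ adj @ d_inv_sqrt
    _, eigvecs = torch.linalg.eigh(lap)      # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]                # drop the trivial eigenvector

class GraphTransformerLayer(nn.Module):
    """One layer in which node i attends only to itself and its graph neighbors."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, 2 * dim), nn.ReLU(), nn.Linear(2 * dim, dim))
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # boolean mask: True where there is no edge (and no self-loop), so
        # attention stays local to the graph structure
        mask = (adj + torch.eye(adj.size(0))) == 0
        a, _ = self.attn(h.unsqueeze(0), h.unsqueeze(0), h.unsqueeze(0), attn_mask=mask)
        h = self.norm1(h + a.squeeze(0))
        return self.norm2(h + self.ff(h))

# Toy usage: 5 nodes on a ring; node features = random features + projected Laplacian PE.
adj = torch.zeros(5, 5)
for i in range(5):
    adj[i, (i + 1) % 5] = adj[(i + 1) % 5, i] = 1.0
pe = lap_pe(adj, k=4)
h = torch.randn(5, 16) + nn.Linear(4, 16)(pe)
out = GraphTransformerLayer(16)(h, adj)
print(out.shape)  # torch.Size([5, 16])
```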
Papers about graph transformers.
Recipe for a General, Powerful, Scalable Graph Transformer
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
The official implementation of NeurIPS22 spotlight paper "NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification"
The official implementation for ICLR23 spotlight paper "DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion"
[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Official PyTorch code for Structure-Aware Transformer.
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
Deep learning toolkit for Drug Design with Pareto-based Multi-Objective optimization in Polypharmacology
Code for AAAI2020 paper "Graph Transformer for Graph-to-Sequence Learning"
Repository for CARTE: Context-Aware Representation of Table Entries
Long Range Graph Benchmark, NeurIPS 2022 Track on D&B
[ICDE'2023] When Spatio-Temporal Meet Wavelets: Disentangled Traffic Forecasting via Efficient Spectral Graph Attention Networks
Official Code Repository for the paper "Accurate Learning of Graph Representations with Graph Multiset Pooling" (ICLR 2021)
SignNet and BasisNet
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) 2024.
Code for our paper "Attending to Graph Transformers"
[TNNLS-2025] The PyTorch implementation of EmT, a graph transformer for EEG emotion recognition.