Repository for "Integrative Graph-Transformer Framework for Histopathology Whole Slide Image Representation and Classification"
Updated May 14, 2024
VCR-Graphormer: A Mini-batch Graph Transformer via Virtual Connections, ICLR 2024
Welcome to the Graph Neural Networks (06838-01) class repository for the Department of Artificial Intelligence at the Catholic University of Korea. This platform is dedicated to sharing and archiving lecture materials such as practices, assignments, and sample codes for the class.
Triplet Graph Transformer
An unofficial implementation of Graph Transformer (Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification) - IJCAI 2021
Unified Graph Transformer (UGT) is a novel Graph Transformer model specialized in preserving both local and global graph structures, developed by NS Lab @ CUK on a pure PyTorch backend.
Community-aware Graph Transformer (CGT) is a novel Graph Transformer model that uses community structures to address node-degree biases in the message-passing mechanism, developed by NS Lab @ CUK on a pure PyTorch backend.
[NeurIPS'23 Spotlight] Learning Probabilistic Symmetrization for Architecture Agnostic Equivariance (LPS), in PyTorch
Hop-Wise Graph Attention for Scalable and Generalizable Learning on Circuits
Papers about graph transformers.
[AAAI'23] MulGT: Multi-task Graph-Transformer with Task-aware Knowledge Injection and Domain Knowledge-driven Pooling for Whole Slide Image Analysis
[MICCAI'23] HIGT: Hierarchical Interaction Graph-Transformer for Whole Slide Image Analysis
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) 2024.
Test graph isomorphism with 1-WL for different graph classes and labelings
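As a minimal sketch of what a 1-WL (Weisfeiler-Leman color refinement) isomorphism test looks like — function and variable names here are illustrative, not taken from the repository above:

```python
from collections import Counter

def wl_1_histogram(edges, num_nodes, iterations=3):
    """Run 1-WL color refinement and return the final color histogram.

    Two graphs with different histograms are certainly non-isomorphic;
    equal histograms mean 1-WL cannot distinguish them.
    """
    adj = [[] for _ in range(num_nodes)]
    for u, v in edges:  # build adjacency lists for an undirected graph
        adj[u].append(v)
        adj[v].append(u)
    colors = [0] * num_nodes  # uniform initial labeling
    for _ in range(iterations):
        # Each node's new color is determined by its old color plus the
        # multiset of its neighbors' colors.
        sigs = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in range(num_nodes)]
        palette = {s: i for i, s in enumerate(sorted(set(sigs)))}
        colors = [palette[s] for s in sigs]
    return Counter(colors)

# Classic failure case: a 6-cycle vs. two disjoint triangles.
# Both are 2-regular, so 1-WL assigns identical histograms.
c6 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
two_triangles = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]
print(wl_1_histogram(c6, 6) == wl_1_histogram(two_triangles, 6))  # True
```

This illustrates why 1-WL expressiveness matters for graph transformers: architectures are often compared against the graph classes 1-WL can or cannot separate.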
[ICDE'2023] When Spatio-Temporal Meet Wavelets: Disentangled Traffic Forecasting via Efficient Spectral Graph Attention Networks
MANDO-HGT is a framework for detecting smart contract vulnerabilities. Given a contract in either source code or bytecode form, MANDO-HGT adapts heterogeneous graph transformers with customized meta relations for graph nodes and edges to learn their embeddings, then trains classifiers to detect various vulnerability types in the contracts' nodes and graphs.
The official implementation of the NeurIPS'22 spotlight paper "NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification"
Protein Structure Transformer (PST): Endowing pretrained protein language models with structural knowledge
Codebase of paper "Balancing structure and position information in Graph Transformer network with a learnable node embedding"
Pretraining Techniques for Graph Transformers