GNN User Group: meeting 3
The third meeting of the GNN user group will include talks from Marinka Zitnik, Kexin Huang, and Xavier Bresson on GNNs for therapeutics and combinatorial optimization. It takes place tomorrow, 25th March.
Eventbrite: Graph Neural Networks User Group
Model distillation for GNNs
Model distillation is an approach to training a small neural network, called the student, given a large pretrained neural network, called the teacher. The motivation is to reduce the number of parameters of your production model as much as possible while keeping the quality of the solution. One of the first approaches was by Geoffrey Hinton, Oriol Vinyals, and Jeff Dean (what a combo), who proposed training the student network on the logits of the teacher network. Since then, a huge number of losses have appeared that attempt to improve the student's performance, but the original approach by Hinton et al. still works reasonably well. A good survey is this recent one.
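In pseudocode, the Hinton et al. recipe amounts to matching the temperature-softened softmax outputs of the two networks. A minimal, framework-free sketch of the loss (plain Python lists stand in for tensors; in practice you would also add the usual cross-entropy on the hard labels):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T gives softer targets."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as suggested by Hinton et al. (2015)."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

The loss is zero when the student exactly matches the teacher's logits and grows as the two distributions diverge; the temperature here is an illustrative hyperparameter choice.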
Surprisingly, there have not been many papers on model distillation for GNNs. Here are a few examples:
* Reliable Data Distillation on Graph Convolutional Network SIGMOD 2020
* Distilling Knowledge from Graph Convolutional Networks CVPR 2020
* Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework WWW 21
But these approaches were not convincing enough for me to say that knowledge distillation is solved for GNNs, so I'd say it's still an open research question. I have also tried training an MLP on GNN logits to see whether a GNN can be replaced with an MLP at inference time; apparently you can get an uplift over a vanilla MLP trained on the targets, but the performance is not as good as that of GNNs. One good example of significantly reducing the number of parameters of GNNs is the recent work on LP for node classification: LP has 0 parameters, and with C&S it gains some MLP parameters, but not as many as GNNs have.
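For reference, the zero-parameter LP baseline mentioned above can be sketched in a few lines. This is a generic label propagation variant (the adjacency-dict representation, iteration count, and damping factor are my illustrative choices, not tied to the C&S paper):

```python
def label_propagation(adj, labels, train_nodes, num_iters=50, alpha=0.9):
    """adj: dict node -> list of neighbors; labels: dict node -> class
    distribution for training nodes. A zero-parameter baseline: repeatedly
    average neighbor label distributions, clamping known training labels."""
    n_classes = len(next(iter(labels.values())))
    initial = {v: list(labels.get(v, [0.0] * n_classes)) for v in adj}
    state = {v: list(f) for v, f in initial.items()}
    for _ in range(num_iters):
        new_state = {}
        for v, nbrs in adj.items():
            if nbrs:
                agg = [sum(state[u][c] for u in nbrs) / len(nbrs)
                       for c in range(n_classes)]
            else:
                agg = state[v]
            # mix the neighbor average with the initial label signal
            new_state[v] = [alpha * a + (1 - alpha) * y
                            for a, y in zip(agg, initial[v])]
        # clamp known training labels after each iteration
        for v in train_nodes:
            new_state[v] = list(labels[v])
        state = new_state
    return state
```

On a tiny path graph 0–1–2 with labeled endpoints, the middle node ends up with an even mix of the two classes, exactly as you would expect from pure propagation.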
GitHub: HobbitLong/RepDistiller — [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods
Tensorflow GNN libraries
If I missed any TensorFlow libraries, please let me know and I will update the list.
* tf_geometric (paper)
* tf2-gnn (microsoft)
* tf-gnn-samples (microsoft)
* Spektral (documentation)
* graph_nets (deepmind)
* gnn (documentation)
GitHub: CrawlScript/tf_geometric — Efficient and Friendly Graph Neural Network Library for TensorFlow 1.x and 2.x
GNN Explainer UI
An awesome tool that provides a user interface for visualizing edge attributions of trained GNN models and comparing different explanation methods. An explanation method takes as input a GNN model and a single sample graph, and outputs attribution values for all edges in the graph. Each explanation method uses a different approach to calculating how important each edge is, so it is important to evaluate the explanation methods themselves as well.
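To make the notion of an explanation method concrete, here is the simplest possible one, occlusion: remove each edge in turn and measure how much the model's prediction changes. Everything below (the interface, the toy model, the scoring rule) is an illustrative sketch of the idea, not the tool's actual API:

```python
def edge_attributions(model, edges, node_feats):
    """Score each edge by how much the model's output drops when that
    edge is removed -- a simple occlusion-style explanation method.
    `model` maps (edges, node_feats) -> a scalar prediction."""
    base = model(edges, node_feats)
    scores = {}
    for e in edges:
        reduced = [x for x in edges if x != e]
        scores[e] = base - model(reduced, node_feats)
    return scores

def toy_model(edges, feats):
    """A stand-in for a trained GNN: sums feature products over edges."""
    return sum(feats[u] * feats[v] for u, v in edges)
```

Real explanation methods (gradients, masking, Shapley-style attributions) differ only in how they assign the per-edge scores; the interface stays the same, which is what makes comparing them in one UI possible.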
Fresh picks from ArXiv
This week on ArXiv: tricks to improve GNNs, unlearning problem on graphs, and cheating on TOEFL with GNNs ✍️
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* A nonlinear diffusion method for semi-supervised learning on hypergraphs with Austin R. Benson
* Bag of Tricks of Semi-Supervised Classification with Graph Neural Networks
* Self-supervised Graph Neural Networks without explicit negative sampling
* Graph Unlearning
* InsertGNN: Can Graph Neural Networks Outperform Humans in TOEFL Sentence Insertion Problem?
* Beyond permutation equivariance in graph networks
* Knowledge-aware Contrastive Molecular Graph Learning
* Autism Spectrum Disorder Screening Using Discriminative Brain Sub-Networks: An Entropic Approach
Survey
* A Comprehensive Survey on Knowledge Graph Entity Alignment via Representation Learning
Video and slides: GNN User Group meeting 3
In the third meeting of the GNN user group, there were two talks:
* Therapeutics Data Commons: Machine Learning Datasets and Tasks for Therapeutics by Marinka Zitnik and Kexin Huang (Harvard)
* The Transformer Network for TSP by Xavier Bresson (NTU)
Slides are available in their slack channel.
YouTube
Graph Neural Networks User Group Meeting on Mar 25, 2021
Agenda
4:00 - 4:30 (PST): Therapeutics Data Commons: Machine Learning Datasets and Tasks for Therapeutics
Abstract: Machine learning (ML) for therapeutics is an emerging field with incredible opportunities for innovation and expansion. Despite the initial…
Pytorch Geometric tutorial
Awesome tutorials on how to program GNNs with PyTorch Geometric. I often say that the best way to learn about GNNs is through coding, so if you are new, I would definitely recommend checking it out. There are upcoming sessions soon if you want to attend live.
YouTube
PyTorch Geometric tutorial: Graph Autoencoders & Variational Graph Autoencoders
In this tutorial, we present Graph Autoencoders and Variational Graph Autoencoders from the paper:
https://arxiv.org/pdf/1611.07308.pdf
Later, we show an example taken from the official PyTorch geometric repository:
https://github.com/rusty1s/pytorch_g…
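The decoder in the GAE/VGAE paper covered by the tutorial is just an inner product between node embeddings followed by a sigmoid. A minimal sketch of that decoding step (the encoder that produces the embeddings is omitted, and plain lists stand in for tensors):

```python
import math

def decode_adjacency(embeddings):
    """Graph autoencoder decoder (Kipf & Welling, 2016): the probability
    of an edge (i, j) is the sigmoid of <z_i, z_j>."""
    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))
    n = len(embeddings)
    return [[sigmoid(sum(a * b for a, b in zip(embeddings[i], embeddings[j])))
             for j in range(n)]
            for i in range(n)]
```

Nodes with similar embeddings get a high reconstructed edge probability, dissimilar ones a low probability, and the output is symmetric by construction.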
Graph Machine Learning research groups: Mingyuan Zhou
I'm doing a series of posts on groups in graph research; the previous post is here. The 26th is Mingyuan Zhou, a professor at The University of Texas at Austin, who has been working on statistical aspects of GNNs.
Mingyuan Zhou (~1985)
- Affiliation: The University of Texas at Austin
- Education: Ph.D. at Duke University in 2013 (advisor: Lawrence Carin)
- h-index 30
- Interests: hyperbolic graph embeddings, Bayesian GNNs, graph autoencoders
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Yaron Lipman
I do a series of posts on the groups in graph research, previous post is here. The 25th is Yaron Lipman, a professor in Israel, who has been co-authoring many papers on equivariances and the power of GNNs.…
Insights from Physics on Graphs and Relational Bias
A great lecture with lots of insights by Kyle Cranmer on the inductive biases involved in physics. Applying GNNs to natural science problems is one of the biggest trends in ML, and it's exciting to see more and more cool results in this area.
YouTube
Graph Deep Learning 2021 - Kyle Cranmer - GNNs in physics
This guest lecture was part of the 2021 Graph Deep Learning course at Università della Svizzera italiana (https://www.usi.ch/). Prof. Cranmer talks about how relational inductive biases can be useful to describe physical systems.
Open Research Problems in Graph ML
I thought I would make my first subscriber-only post about open research problems in graph ML. These are problems that I have thought about a lot and that I believe can have a transformational impact, not only on this field but also on applications of graph models to other areas.
Substack
GML Subscribers - Open Research Problems in graph ML community
Fresh picks from ArXiv
This week on ArXiv: new heterophily datasets, improved inference and expressive power for GNNs 🦹
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* New Benchmarks for Learning on Non-Homophilous Graphs
* Adaptive Filters and Aggregator Fusion for Efficient Graph Convolutions with Pietro Liò
* Improving the Expressive Power of Graph Neural Network with Tinhofer Algorithm
* Sub-GMN: The Subgraph Matching Network Model
* SST-GNN: Simplified Spatio-temporal Traffic forecasting model using Graph Neural Network
Math
* Using Graph Theory to Derive Inequalities for the Bell Numbers
Software
* LAGraph: Linear Algebra, Network Analysis Libraries, and the Study of Graph Algorithms
Survey
* Scene Graphs: A Survey of Generations and Applications
Topological GNNs and graph models for video
Two articles were recently featured on Synced. One is Making GNNs ‘Topology-Aware’ to Advance their Expressive Power, about using persistent homology for additional expressivity of GNNs. The other is SOTA GNN ‘Reasons’ Interactions over Time to Boost Video Understanding, about modeling images as graphs to reason about their content.
Synced | AI Technology & Industry Review
Making GNNs ‘Topology-Aware’ to Advance their Expressive Power: New Paper from ETH, SIB & KU Leuven
A research team from ETH Zurich, SIB Swiss Institute of Bioinformatics, and KU Leuven proposes Topological Graph Layer (TOGL), a new type of graph neural network layer capable of leveraging the multi-scale topological information of input graphs.
Adaptive Filters and Aggregator Fusion for Efficient Graph Convolutions
A blog post by Shyam A. Tailor about a simple modification of the GCN layer that is both more efficient and more effective than many standard message-passing schemes.
The London Geometry and Machine Learning Summer School 2021
A very cool one-week school on geometric deep learning, happening online this summer. Early-career researchers such as Ph.D. students will work in small groups on a research project under the guidance of experienced mentors. Applications are open until 31 May 2021.
www.logml.ai
Bag of Tricks for Semi-Supervised classification
There is a nice short paper on tricks for improving the performance of GNNs. The author, Yangkun Wang from the DGL team, has many high-scoring entries on the OGB leaderboard, so these tricks are worth employing: they boost performance only a bit, but they do so consistently. The tricks include:
* data augmentation
* using labels as node features
* renormalization of adjacency matrix
* novel loss functions
* residual connections from the input
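The "labels as node features" trick from the list above is easy to sketch: append a one-hot of the training label to each node's feature vector, with zeros for nodes whose labels must stay hidden. The helper below is my illustration, not the paper's code; in practice the training set is also split so a node never sees its own label:

```python
def labels_as_features(node_feats, labels, train_nodes, n_classes):
    """Append a one-hot of the (training) label to each node's feature
    vector; non-training nodes get an all-zero vector, so the model
    cannot peek at the labels it is supposed to predict."""
    out = {}
    for v, feats in node_feats.items():
        onehot = [0.0] * n_classes
        if v in train_nodes:
            onehot[labels[v]] = 1.0
        out[v] = list(feats) + onehot
    return out
```

After this transformation, message passing lets test nodes aggregate their neighbors' known labels, which is where the consistent uplift comes from.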
Open Graph Benchmark
Leaderboards for Node Property Prediction
Check leaderboards for - ogbn-products - ogbn-proteins - ogbn-arxiv - ogbn-papers100M - ogbn-mag
Mathematicians Settle Erdős Coloring Conjecture
The Erdős–Faber–Lovász conjecture states that the minimum number of colors needed to shade the edges of a hypergraph, so that no overlapping edges share a color, is bounded by the number of vertices. After 50 years of research, it has finally been resolved.
Quanta Magazine
Mathematicians Settle Erdős Coloring Conjecture
Fifty years ago, Paul Erdős and two other mathematicians came up with a graph theory problem that they thought they might solve on the spot. A team of mathematicians has finally settled it.
Fresh picks from ArXiv
This week on ArXiv: improved power of GNNs, new autoML library for graphs, and decreasing query time for graph traversal 🕔
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Theoretically Improving Graph Neural Networks via Anonymous Walk Graph Kernels
* A Graph VAE and Graph Transformer Approach to Generating Molecular Graphs
* Learning to Coordinate via Multiple Graph Neural Networks
* DyGCN: Dynamic Graph Embedding with Graph Convolutional Network
* Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs with Philip S. Yu
* The World as a Graph: Improving El Niño Forecasts with Graph Neural Networks
kNN
* Graph Reordering for Cache-Efficient Near Neighbor Search with Alex Smola
Software
* AutoGL: A Library for Automated Graph Learning
Survey
* Representation Learning for Networks in Biology and Medicine: Advancements, Challenges, and Opportunities with Marinka Zitnik
Outlier detection and description workshop at KDD 2021
Graph methods are very popular in fraud detection, as they are capable of distinguishing the interactions of fraudsters from those of benign users. There is a big workshop at KDD 2021 on detecting and describing outliers, with a great list of keynote speakers.
oddworkshop.github.io
ODD 2021 - 6th Outlier Detection and Description Workshop
ODD 2021, 6th Outlier Detection and Description Workshop, co-located with KDD 2021, Virtual
Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks
A video presentation (and slides) by Cristian Bodnar & Fabrizio Frasca on a new type of GNN that defines neighborhoods based on the simplicial complexes of a graph. It goes quite deep into the theory, with supporting experiments on graph isomorphism, graph classification, and trajectory disambiguation.
Videos from CS224W
The legendary Stanford CS224W course on graph ML is now releasing its 2021 videos on YouTube, with two lectures promised each week. Slides are available on the site too (homeworks are still missing).
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 1.1 - Why Graphs
For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3Bu1w3n
Jure Leskovec
Computer Science, PhD
Graphs are a general language for describing and analyzing entities with relations/interactions.…