Video: Graph Neural Networks - a perspective from the ground up
A beautiful video about GNNs aimed at CS undergrads: it explains what message passing and node embeddings are and walks through a link prediction example.
What is a graph, why Graph Neural Networks (GNNs), and what is the underlying math?
Highly recommended videos that I watched many times while making this:
Petar Veličković's GNN video → https://youtu.be/8owQBFAHw7E
Michael Bronstein's Geometric Deep Learning…
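To make the link-prediction part concrete: once a model has produced node embeddings, a candidate edge can be scored by comparing the embeddings of its two endpoints. A minimal sketch, assuming a dot-product decoder and hand-made toy embeddings (both are illustrative choices, not taken from the video):

```python
import numpy as np

# Toy node embeddings, e.g. produced by a GNN after a few rounds of
# message passing (rows = nodes, columns = embedding dimensions).
emb = np.array([
    [0.9, 0.1],   # node 0
    [0.8, 0.2],   # node 1
    [0.1, 0.9],   # node 2
])

def link_score(u: int, v: int) -> float:
    """Score a candidate edge (u, v) with a simple dot-product decoder."""
    return float(emb[u] @ emb[v])

# Nodes 0 and 1 have similar embeddings, so their candidate edge scores
# higher than the one between nodes 0 and 2.
print(link_score(0, 1))  # 0.74
print(link_score(0, 2))  # 0.18
```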
Graph Drawing and Network Visualization 2021
The Symposium on Graph Drawing and Network Visualization is a nice niche conference on how to draw graphs efficiently and insightfully. This year it will be organized both online and offline (in Tübingen, Germany) on September 14-17, 2021. Accepted papers can be seen here.
Graph Drawing 2021 - GD2021
The 29th International Symposium on Graph Drawing and Network Visualization will be held (hopefully) at Tübingen from September 15 to 17, 2021. A pre-conference PhD school is planned for September 13-14, 2021.
Fresh picks from ArXiv
This week on ArXiv: predictions of routing times, benchmarking architecture tricks, and a drug repurposing study 💊
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
* Single Node Injection Attack against Graph Neural Networks CIKM 2021
* ETA Prediction with Graph Neural Networks in Google Maps CIKM 2021, with Petar Veličković
* Tree Decomposed Graph Neural Network CIKM 2021, with Tyler Derr
* DSKReG: Differentiable Sampling on Knowledge Graph for Recommendation with Relational GNN CIKM 2021
* Multiplex Graph Neural Network for Extractive Text Summarization EMNLP 2021
* Demystifying Drug Repurposing Domain Comprehension with Knowledge Graph Embedding IEEE BioCAS 2021
* DC-GNet: Deep Mesh Relation Capturing Graph Convolution Network for 3D Human Shape Reconstruction ACM MM'21
* Visualizing JIT Compiler Graphs GD 2021
GNNs
* Spatio-Temporal Graph Contrastive Learning
Benchmark
* Weisfeiler-Leman in the BAMBOO: Novel AMR Graph Metrics and a Benchmark for AMR Graph Similarity TACL 2021
* Bag of Tricks for Training Deeper Graph Neural Networks: A Comprehensive Benchmark Study
Math
* Smallest graphs with given automorphism group
GNN Tutorial & Graph Convolution Intuition @ Distill
Distill.pub is a great resource aimed at redefining the way we publish papers. Publications on Distill have rich visualizations and hands-on examples that you can tweak right in the browser. Unfortunately, Distill is going on hiatus.
But, as a final bow, the authors prepared two very cool articles breaking down message passing and graph convolutions:
1. A Gentle Introduction to Graph Neural Networks
2. Understanding Convolutions on Graphs
Something you definitely do not want to miss in September!
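If you want the core idea in code before reading the articles, below is a minimal sketch of a single graph-convolution (message-passing) step with the symmetric normalization used by GCN; the toy graph and the random weight matrix are illustrative assumptions, not code from the Distill posts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph on 4 nodes with edges 0-1, 1-2, 2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))    # node features
W = rng.normal(size=(8, 16))   # layer weights (would be learned; random here)

# One GCN-style layer: H = ReLU(D^{-1/2} (A + I) D^{-1/2} X W)
A_hat = A + np.eye(4)                     # add self-loops
deg = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
H = np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

print(H.shape)  # (4, 16): every node now mixes its own and its neighbors' features
```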
Monday Theory: Structural vs Positional Node Representations
In a new slide deck, Bruno Ribeiro (Purdue University) uncovers the nature of two commonly used mechanisms for building node representations: structural representations are permutation-insensitive (like GNN embeddings), whereas positional representations are permutation-sensitive (like SVD vectors). Hence, all GRL approaches can be broadly classified into these two families. Takeaway messages:
Message 1: Positional representations of k nodes are to the most expressive k-node structural representations as samples of a distribution are to sufficient statistics of that distribution. This is based on results published in the ICLR'20 paper.
Message 2: As soon as you introduce some sort of node IDs, you break equivariance, but at the same time you can predict properties of any subset of nodes (better link prediction). You'd better aggregate over multiple samples, though (following the statistics analogy). If you stick to equivariance, you can predict node- or graph-level properties, but nothing in between.
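A quick way to feel the difference in code: a structural representation assigns identical vectors to structurally identical (automorphic) nodes, while a positional representation tells them apart. A small sketch, using degree-based features as the structural representation and a truncated SVD of the adjacency matrix as the positional one (illustrative choices, not Bruno's exact constructions):

```python
import numpy as np

# Path graph 0-1-2-3: nodes 0 and 3 are automorphic (structurally identical).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Structural representation (GNN-flavoured): own degree + sum of neighbors' degrees.
deg = A.sum(axis=1)
structural = np.stack([deg, A @ deg], axis=1)
print(np.allclose(structural[0], structural[3]))  # True: automorphic nodes coincide

# Positional representation: top-2 SVD components of A.
U, S, _ = np.linalg.svd(A)
positional = U[:, :2] * S[:2]
print(np.allclose(positional[0], positional[3]))  # False: "position" tells them apart
```

That is exactly the trade-off of Message 2: positional features can distinguish node pairs (useful for link prediction) at the cost of equivariance.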
Fresh picks from ArXiv
This week on ArXiv: improving robustness by resampling a graph, learning better scenes, and new homophily definitions 🐤
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Training Graph Neural Networks by Graphon Estimation
* Learning to Generate Scene Graph from Natural Language Supervision ICCV 2021
* PointSpectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning
* Sparsifying the Update Step in Graph Neural Networks
* Adaptive Label Smoothing To Regularize Large-Scale Graph Training
Math
* How likely is a random graph shift-enabled?
* The Popularity-Homophily Index: A new way to measure Homophily in Directed Graphs
The Learning on Graphs and Geometry Reading Group
A reading group organized by Hannes Stärk under the supervision of Pietro Liò at Cambridge. It covers really interesting fresh papers on graphs. Every Tuesday at 5pm CEST.
Researcher Positions at Dimitri Ognibene's Lab
Two postdoc/researcher positions are available at the University of Milano-Bicocca under Dimitri Ognibene's supervision: 2-year contracts, based in Milan (with the possibility of remote work). To apply, contact Dimitri Ognibene at [email protected]. Description below:
Do social media harm teenagers and our society?
Can we make them safer?
We will use the state of the art in graph neural networks, reinforcement learning, NLP, CV, and machine learning in general to improve our understanding of social media dynamics, and to help our society by supporting and teaching young people to tackle hate speech and fake news on social media.
Graph ML in Industry Workshop
When I wrote about the top applications of GNNs at the beginning of this year, I had a feeling that the graph ML community was mature enough for graph ML to start being used in industrial companies. Nine months later, we decided to gather researchers, engineers, and industry professionals to talk about applications of graphs in companies. Please join us on 23rd Sept, 17:00 Paris time (free, online, ~3 hours) by registering at the link.
Graph Machine Learning in Industry
Criteo AI Lab is excited to be presenting Graph Machine Learning in Industry. Please join us on Thursday, September 23rd, at 17:00 Paris time. This page will be updated with video links after the workshop.
Review: Deep Learning on Sets
A new blog post by Fabian Fuchs and others about recent approaches to applying deep learning to sets. It digests several paradigms, such as permuting & averaging, sorting, approximating invariance, and learning on graphs, as ways of achieving permutation invariance in machine learning algorithms.
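The "permuting & averaging" paradigm from the post is easy to demonstrate: encode each set element independently, pool with a symmetric function (sum or mean), then decode; the output then cannot depend on the input order. A minimal sketch with random matrices standing in for trained networks (an illustrative assumption, not code from the blog post):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 8))   # per-element encoder weights (phi)
W2 = rng.normal(size=(8, 4))   # decoder weights applied after pooling (rho)

def deep_sets(X: np.ndarray) -> np.ndarray:
    """Permutation-invariant set encoder: phi per element, sum-pool, rho on top."""
    phi = np.tanh(X @ W1)       # encode each element independently
    pooled = phi.sum(axis=0)    # symmetric pooling discards the ordering
    return np.tanh(pooled @ W2)

X = rng.normal(size=(5, 3))                            # a set of 5 elements
perm = rng.permutation(5)
print(np.allclose(deep_sets(X), deep_sets(X[perm])))   # True: order does not matter
```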
Organizational Update
I've been running this channel alone for almost two years, but recently it has become more challenging to keep up the previous pace. To help me, Michael Galkin, who has already been involved in several posts here, generously agreed to become one of the admins of this channel.
Michael Galkin is a postdoc at Mila & McGill; you may know him from his amazing digests of knowledge graph papers, contributions to open-source projects, and strong research work. Please welcome Michael and subscribe to his Twitter.
P.S. I will also use this opportunity to remind you that if you have something to share with the graph community, do not hesitate to contact us.
Michael Galkin – Medium
Read writing from Michael Galkin on Medium. AI Research Scientist on Graph and Geometric learning.
PyG 2.0 (PyTorch Geometric 2.0) Release
One of the most prominent libraries in the world of GNNs and Geometric DL got a major update (and a small re-branding to the shorter "PyG")! Now with a website and a Slack channel.
In addition to a constantly growing number of supported GNN architectures, the 2.0 version features:
1. Heterogeneous graph support with models, mini-batching, sampling, and a one-line conversion of homogeneous models to heterogeneous.
2. GraphGym - a whole platform for designing and experimenting with GNN architectures where you can fine-tune the nitty-gritty details of your model and find the best hyperparams. Based on the NeurIPS'20 paper
3. Pre-defined models - before, you'd usually build a GNN model from a collection of layers yourself (trying not to forget to put that non-linearity after the GCN layer). Now the library includes 25 well-known models (see the sketch after this list)!
4. Half-precision support and other smaller improvements to make your GNN journey easier.
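For a taste of point 3, here is a rough sketch of calling one of the pre-defined models on a toy graph; the constructor arguments may differ slightly between PyG versions, so treat it as a sketch to check against the documentation rather than the exact API:

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCN

# Toy graph: a 4-node path with edges in both directions, 16-dim node features.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])
data = Data(x=torch.randn(4, 16), edge_index=edge_index)

# Pre-defined model: no need to stack GCNConv layers and activations by hand.
model = GCN(in_channels=16, hidden_channels=32, num_layers=2, out_channels=7)
out = model(data.x, data.edge_index)
print(out.shape)  # torch.Size([4, 7]): one 7-dim prediction per node
```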
Release 2.0.0 · pyg-team/pytorch_geometric
PyG 2.0 🎉 🎉 🎉
PyG (PyTorch Geometric) has been moved from my own personal account rusty1s to its own organization account pyg-team to emphasize the ongoing collaboration between TU Dortmund Univers...
Fresh picks from ArXiv
This week on ArXiv: linking GNNs to causal models, augmenting data, and using knowledge graphs with BERT 🧸
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Relating Graph Neural Networks to Structural Causal Models with Petar Veličković, Kristian Kersting
* A Study of Joint Graph Inference and Forecasting with Stephan Günnemann
* Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node Classification?
* Local Augmentation for Graph Neural Networks
* Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT EMNLP 2021
Math
* There does not exist a strongly regular graph with parameters (1911,270,105,27)
* On the Fundamental Limits of Matrix Completion: Leveraging Hierarchical Similarity Graphs
Stanford Graph Learning Workshop
A great online workshop organized by Stanford on Thursday, Sept 16, 2021, 08:00-17:00 Pacific Time. It includes talks by Jure Leskovec, Matthias Fey, Weihua Hu, and Jiaxuan You, as well as a series of talks on applications of GNNs and two industry panels.
Modeling Intelligence via Graph Neural Networks: slides
The slides of the thesis by Keyulu Xu: Modeling Intelligence via Graph Neural Networks. Keyulu is one of the authors of GIN and other notable works in GML.
Geometric Deep Learning @ML Street Talk
Michael Bronstein, Petar Veličković, Taco Cohen, and Joan Bruna are the special guests of a new 3.5-hour (👀) episode of ML Street Talk, talking about Geometric DL and explaining the concepts covered in their recent book and pretty much all of the current state of the art in the field. Available on YouTube as a video and as a podcast on all major platforms.
GEOMETRIC DEEP LEARNING BLUEPRINT
Fresh picks from ArXiv
This week on ArXiv: demystifying performance of hyperbolic embeddings, complex question answering, and emotion chatbots 👧
If I forgot to mention your paper, please shoot me a message and I will update the post.
Knowledge graphs
* Benchmarking the Combinatorial Generalizability of Complex Query Answering on Knowledge Graphs
* Complex Temporal Question Answering on Knowledge Graphs
* Emily: Developing An Emotion-affective Open-Domain Chatbot with Knowledge Graph-based Persona
Benchmarking
* Comparing Euclidean and Hyperbolic Embeddings on the WordNet Nouns Hypernymy Graph
GNNs
* Releasing Graph Neural Networks with Differential Privacy Guarantees
OGB Large-Scale Challenge Workshop - Presentations of the Winners
OGB-LSC is a KDD'21 challenge organized by the OGB team, known for the largest-to-date benchmarking datasets for node-level (240M nodes / 1.7B edges), link-level (90M nodes / 500M edges), and graph-level (4M molecules) tasks. Surely, not all academic labs can afford such compute, but that makes the approaches taken by the winners all the more interesting! Are there any smart tricks, or merely "more layers - more ensembles - GPUs go brrr"?
Finally, the recordings of the LSC workshop are available! (~3 hours long, so the Graph ML channel editors assume you've already successfully digested the ML Street Talk for breakfast)
The 2nd day of the workshop features (videos are available):
- Invited talks by Viktor Prasanna (USC), Marinka Zitnik (Harvard), and Larry Zitnick (Facebook AI)
- Panel discussion on the future of Graph ML with Yizhou Sun (UCLA), Zheng Zhang (NYU / Amazon), Shuiwang Ji (Texas A&M), and Jian Tang (MILA)
OGB-LSC @ KDD Cup 2021
Learn about the workshop schedule. All the sessions are live talks over Zoom. You need to register for the KDD conference in order to join the event.
GML Express: Graph ML in Industry Workshop, Geometric Deep Learning, and New Software.
In case you missed the most popular recent events in graph ML, here is a fresh newsletter with recent videos, courses, books, trends, and future events.
"The real voyage of discovery consists not in seeking new lands but seeing with new eyes." Marcel Proust
Graph Machine Learning in Industry workshop live
Our workshop starts in one hour, and I'm excited about the speakers and talks ahead (something I would like to attend even if I hadn't organized it). You can join us on YouTube or Zoom, and we encourage you to ask questions.
The topics are:
0. Me (17:00 Paris time): opening remarks
1. James Zhang (AWS) (17:15): Challenges and Thinking in Go-production of GNN + DGL.
2. Charles Tapley Hoyt (Harvard) (17:45): Current Issues in Theory, Reproducibility, and Utility of Graph Machine Learning in the Life Sciences.
3. Anton Tsitsulin (Google) (18:15): Graph Learning for Billion Scale Graphs.
4. Cheng Ye (AstraZeneca) (19:00): Predicting Potential Drug Targets Using Tensor Factorisation and Knowledge Graph Embeddings.
5. Rocío Mercado (MIT) (19:30): Accelerating Molecular Design Using Graph-Based Deep Generative Models.
6. Lingfei Wu (JD.com) (20:00): Deep Learning On Graphs for Natural Language Processing.
Kite: An interactive visualization tool for graph theory
Another tool, called Kite, for drawing simple graphs and running some graph algorithms.
GitHub - erkal/kite: An interactive visualization tool for graph theory
An interactive visualization tool for graph theory - erkal/kite