ICLR 2021 stats
Dates: May 4-8
Where: Online
All papers can be found here. Graph papers can be found here.
• 2997 submissions (vs 2594 in 2020)
• 860 accepted (vs 687 in 2020)
• 29% acceptance rate (vs 26.5% in 2020)
• 50 graph papers (6% of total)
Boost then Convolve: Gradient Boosting Meets Graph Neural Networks
In our new work at ICLR 2021, we explore how to apply Gradient Boosted Decision Trees (GBDT) to graphs. Surprisingly, I hadn't encountered papers before that test the performance of pure GBDT on graphs, for example for node classification.
GBDTs are usually used for heterogeneous data (e.g. in Kaggle competitions): the columns can be categorical and of different scale and meaning (e.g. an income column vs. an age column). Such data is quite common in the real world, but most research graph datasets have sparse, homogeneous node features (e.g. bag-of-words features or word embeddings). So we asked whether GNNs are effective on graphs with heterogeneous features.
The first insight is that you can simply pretrain a GBDT on the node features and use its predictions as inputs for training the GNN. This alone already boosts the GNN's performance.
Second, we propose a scheme for training the GBDT and the GNN end-to-end, which boosts performance further.
Third, this combination of GBDT and GNN, which we call BGNN, converges much faster than a GNN and is therefore usually faster to train than a pure GNN.
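The first insight above can be sketched in a few lines. This is a minimal toy illustration, not the paper's actual BGNN implementation: sklearn's GradientBoostingClassifier stands in for the GBDT, a single mean-aggregation step over neighbors stands in for a GNN layer, and the graph and labels are random.

```python
# Toy sketch of "pretrain GBDT, feed its predictions to a GNN".
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n_nodes, n_feats, n_classes = 100, 8, 3

X = rng.normal(size=(n_nodes, n_feats))        # heterogeneous node features
y = rng.integers(0, n_classes, size=n_nodes)   # node labels
A = (rng.random((n_nodes, n_nodes)) < 0.05)    # random adjacency matrix
A = np.logical_or(A, A.T).astype(float)
np.fill_diagonal(A, 1.0)                       # add self-loops

# Step 1: pretrain the GBDT on raw node features (the graph is ignored here).
gbdt = GradientBoostingClassifier(n_estimators=50).fit(X, y)

# Step 2: append the GBDT class probabilities to the node features ...
X_aug = np.hstack([X, gbdt.predict_proba(X)])

# ... and run one mean-aggregation ("convolve") step, as a GNN layer would,
# before feeding the result into the rest of the GNN.
deg = A.sum(axis=1, keepdims=True)
H = (A @ X_aug) / deg

print(H.shape)  # (100, 11): the original 8 features plus 3 GBDT probabilities
```

The end-to-end variant additionally backpropagates the GNN's loss into new trees, which a few lines of numpy cannot capture; see the paper's code for the real thing.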
Some limitations.
* BGNN works well with heterogeneous features, so Cora and other datasets with homogeneous features are still better off with a plain GNN.
* The approach works for node regression and classification. We have some ideas for extending it to link prediction and graph classification, but haven't worked them out yet. If you are interested in continuing this line of work, let me know.
The code and datasets are available here.
Graph Machine Learning research groups: Stefanie Jegelka
I'm doing a series of posts on groups in graph research; the previous post is here. The 22nd is Stefanie Jegelka, a professor at MIT working on submodular functions, DPPs, and, more recently, theoretical aspects of GNNs.
Stefanie Jegelka (~1986)
- Affiliation: MIT
- Education: Ph.D. at the Max Planck Institute for Intelligent Systems, Tübingen, and ETH Zurich in 2012 (advisors: Jeff Bilmes, Bernhard Schölkopf, Andreas Krause)
- h-index 33
- Awards: Joseph A Martore Award, NSF CAREER Award, best paper awards at ICML and NeurIPS
- Interests: generalization and expressivity of GNNs, clustering and graph partitioning
Graph Machine Learning research groups: Michele Coscia
I'm doing a series of posts on groups in graph research; the previous post is here. The 21st is Michele Coscia, the author of the atlas of network science.
Michele Coscia (~1985)
- Affiliation: IT…
PhD position in Graph Neural Networks Modelling
The Norwegian University of Science and Technology has opened a PhD position on the thesis topic "Interpretable Models with Graph Neural Networks to Support the Green Transition of Critical Infrastructures". The deadline is 1 Feb 2021; it is a 3-year contract at ~500K NOK per year before tax.
Fresh picks from ArXiv
This week on ArXiv: scaling GNNs to billions of edges, connection between label propagation and message passing, and GNNs for DAGs 🗡
Scalability
* Learning Massive Graph Embeddings on a Single Machine
* PyTorch-Direct: Enabling GPU Centric Data Access for Very Large Graph Neural Network Training with Irregular Accesses
Graph models
* Generating a Doppelganger Graph: Resembling but Distinct
* A Generalized Weisfeiler-Lehman Graph Kernel
* A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations with Austin R. Benson
* Hypergraph clustering: from blockmodels to modularity with Austin R. Benson
ICLR 2021
* Boost then Convolve: Gradient Boosting Meets Graph Neural Networks with me
* Directed Acyclic Graph Neural Networks
WWW 2021
* How Do Hyperedges Overlap in Real-World Hypergraphs? -- Patterns, Measures, and Generators
Survey
* A Review of Graph Neural Networks and Their Applications in Power Systems
Course: ODS Knowledge Graphs
Michael Galkin is starting a self-paced course on knowledge graphs. For now it's only in Russian, with a plan to make an English version after the first iteration. The introductory lecture is available on YouTube. You can join the discussion group with all your questions and proposals: @kg_course. The first lecture is this Thursday; more in the channel @kg_course.
Course curriculum:
* Knowledge representations (RDF, RDFS, OWL)
* Storage and queries (SPARQL, Graph DBs)
* Consistency (RDF*, SHACL, ShEx)
* Semantic Data Integration
* Graph theory intro
* KG embeddings
* GNNs for KGs
* Applications: Question Answering, Query Embeddings
GNN User Group events
The first event of the GNN User Group, organized by the DGL team (Amazon) and the cuGraph team (Nvidia), starts tomorrow. Events will be held monthly. The first talk is "A Framework For Differentiable Discovery of Graph Algorithms" by Dr. Le Song (Georgia Tech), followed by a networking session.
RoboGrammar: Graph Grammar for Terrain-Optimized Robot Design
(video) A recent work from MIT, presented at SIGGRAPH Asia 2020, on constructing different robot designs via a graph grammar. Graph grammars were introduced in 1992 and define a set of rules for transforming one graph into another. With this, a user can specify input robot components as well as the type of terrain, and the graph grammar will produce possible robot designs. Next, a variation of the A* algorithm is used to search for the optimal robot design for a given terrain. More on this in this article.
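To make "a set of rules for transforming one graph into another" concrete, here is a toy rewrite rule. The labels and the rule itself are made up for illustration and are not taken from RoboGrammar: the hypothetical rule attaches a new 'leg' node to every node labelled 'body'.

```python
# A toy graph-grammar rewrite rule on a labelled graph.
def apply_rule(nodes, edges, match_label, new_label):
    """Attach a `new_label` child to every node labelled `match_label`."""
    nodes = dict(nodes)   # node id -> label
    edges = list(edges)   # (u, v) pairs
    next_id = max(nodes) + 1
    for nid, label in list(nodes.items()):  # snapshot: we mutate `nodes`
        if label == match_label:
            nodes[next_id] = new_label
            edges.append((nid, next_id))
            next_id += 1
    return nodes, edges

nodes = {0: "body", 1: "wheel"}
edges = [(0, 1)]
nodes2, edges2 = apply_rule(nodes, edges, "body", "leg")
print(nodes2)  # {0: 'body', 1: 'wheel', 2: 'leg'}
print(edges2)  # [(0, 1), (0, 2)]
```

A design search like RoboGrammar's then amounts to exploring sequences of such rule applications, scoring each resulting graph (robot) with a simulator.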
https://cdfg.mit.edu/publications/robogrammar-graph-grammar-for-terrain-optimized-robot-design
CS224W: Machine Learning with Graphs 2021
CS224W is one of the most popular graph courses by Jure Leskovec at Stanford. This year includes extra topics such as label propagation, scalability of GNNs, and graph nets for science and biology. The slides for the first 6 out of 20 lectures are available.
Video: GNN User Group
The video from the first meeting of the GNN user group covers the usage and next release of DGL and features Le Song's talk on combinatorial optimization.
GML Newsletter: Interpolation and Extrapolation of Graph Neural Networks
The new issue of the newsletter is about generalization of GNNs. Compared to the study of expressive power, there are fewer works about generalization. Nonetheless, I gathered the most exciting research I found on this topic, which I hope will familiarize you with this research direction.
Fresh picks from ArXiv
This week on ArXiv: a TensorFlow GNN library, a survey on graph-based kNN search, and automation of peer review? 🧐
Conferences
* Interpreting and Unifying Graph Neural Networks with An Optimization Framework WWW 2021
* A Graph-based Relevance Matching Model for Ad-hoc Retrieval AAAI 2021
Software
* Efficient Graph Deep Learning in TensorFlow with tf_geometric
Survey
* A Comprehensive Survey and Experimental Comparison of Graph-Based Approximate Nearest Neighbor Search
* Graph Neural Network for Traffic Forecasting: A Survey
* Can We Automate Scientific Reviewing?
How many paths of length k exist in a graph?
In case you are preparing for your next interview, here is a nice post describing several solutions to a common interview problem: counting the number of walks of length k between two nodes in a graph. The problem is not as easy as it seems.
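One classic solution, in case you want to check your own answer against it: entry (i, j) of the k-th power of the adjacency matrix A counts exactly the walks of length k from node i to node j.

```python
# Counting walks of length k via powers of the adjacency matrix.
import numpy as np

# Path graph: 0 - 1 - 2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

A2 = np.linalg.matrix_power(A, 2)
print(A2[0, 2])  # 1: the single walk 0 -> 1 -> 2
print(A2[0, 0])  # 1: the walk 0 -> 1 -> 0

A3 = np.linalg.matrix_power(A, 3)
print(A3[0, 1])  # 2: the walks 0 -> 1 -> 0 -> 1 and 0 -> 1 -> 2 -> 1
```

With fast exponentiation (which `matrix_power` uses) this costs O(n³ log k); note it counts walks, which may revisit vertices, not simple paths — counting simple paths is #P-hard in general, which is part of why the problem is trickier than it looks.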
Tutorial: Graph Neural Networks: Models and Applications
A new tutorial at AAAI 2021 covering robustness, attacks, scalability, and self-supervised learning for GNN models. Slides and video are available.
Sberloga Talk
If you speak Russian: today I will present our ICLR 2021 work on combining GBDT with GNNs on graphs with tabular features. The talk is at 19:00 MSK; the Zoom link will be shared soon at @sberlogawithgraphs. For more videos from Sberloga, subscribe here: https://www.youtube.com/c/SBERLOGA
Cleora Paper
I already wrote about Cleora, an unsupervised embedding library; now there is a paper explaining its details. The algorithm is essentially a form of iterated matrix multiplication, yet it outperforms PyTorch-BigGraph, DeepWalk, and others in both link prediction metrics and running time.
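For intuition, the general flavor of such matrix-multiplication embedding schemes can be sketched as follows. This is my rough sketch of the idea, not Cleora's exact algorithm: propagate randomly initialized embeddings over a normalized adjacency matrix for a few iterations, re-normalizing rows after each step.

```python
# Sketch of an iterated-matrix-multiplication embedding scheme.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, n_iters = 6, 4, 3

A = np.ones((n_nodes, n_nodes)) - np.eye(n_nodes)  # toy complete graph
M = A / A.sum(axis=1, keepdims=True)               # random-walk normalization

E = rng.normal(size=(n_nodes, dim))                # random initialization
for _ in range(n_iters):
    E = M @ E                                      # propagate over the graph
    E /= np.linalg.norm(E, axis=1, keepdims=True)  # L2-normalize each row

print(E.shape)  # (6, 4): one unit-norm embedding row per node
```

Since the whole procedure is a handful of (sparse) matrix products, it parallelizes trivially, which is where the speed advantage over sampling-based methods like DeepWalk comes from.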
Deep Learning on Graphs: Method and Applications (DLG-AAAI’21)
An AAAI workshop on GNNs with an amazing list of speakers, including Jure Leskovec, Max Welling, Xavier Bresson, and many others. The Zoom link is available here; it starts today at 5 pm European time.
Fresh picks from ArXiv
This week on ArXiv: link prediction in KGs, unsupervised embedding library, and reconstruction conjecture for up to 13 vertices 💡
Conferences
* Exploring the Subgraph Density-Size Trade-off via the Lovász Extension WSDM 2021
* Effective and Scalable Clustering on Massive Attributed Graphs WebConf 2021
GNN
* Enhance Information Propagation for Graph Neural Network by Heterogeneous Aggregations
* CF-GNNExplainer: Counterfactual Explanations for Graph Neural Networks
Applications
* Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs with William L. Hamilton
* GNN-RL Compression: Topology-Aware Network Pruning using Multi-stage Graph Embedding and Reinforcement Learning
Software
* Cleora: A Simple, Strong and Scalable Graph Embedding Scheme
Math
* Reconstruction of small graphs and tournaments
How to get started with Graph Machine Learning
In a new post, Aleksa Gordić talks in depth about graph ML and its applications and shares useful resources to get you started in this field.