Podcast: TWIML with Michael Bronstein and Taco Cohen
There are two recent podcasts on TWIML. One is with Taco Cohen, a researcher at Qualcomm, on his NeurIPS 2020 work with Max Welling and Pim de Haan called Natural Graph Networks.
The second is with Michael Bronstein, who looks back at the ML achievements of 2020, such as GPT-3 and neural implicit representations, and discusses the landscape of the Graph ML field in 2021.
Post: AlphaFold 2 & Equivariance
"AlphaFold 2 probably used some network/algorithm to map graph features to obtain the initial XYZ coordinates. Later in the pipeline, they improved their initial prediction by iteratively running their structure module."
A scrutiny of AlphaFold 2's inner workings by Justas Dauparas & Fabian Fuchs.
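To make the quoted idea a bit more concrete, here is a schematic toy loop in the spirit of that description. Everything in it (the stand-in module, dimensions, number of iterations) is invented purely for illustration; it is not AlphaFold 2's actual architecture or code.

```python
import torch
import torch.nn as nn

# Stand-in for a learned structure module; the real one is far more complex.
structure_module = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))

coords = torch.randn(128, 3)  # initial XYZ guess, e.g. mapped from graph features
for _ in range(8):            # iteratively refine the prediction ("recycling")
    coords = coords + structure_module(coords)  # predict a residual correction
```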
"AlphaFold 2 probably used some network/algorithm to map graph features to obtain the initial XYZ coordinates. Later in the pipeline, they improved their initial prediction by iteratively running their structure module."
Scrutiny of the AlphaFold 2 inner workings by Justas Dauparas & Fabian Fuchs.
fabianfuchsml.github.io
Fabian Fuchs
# AlphaFold 2 & Equivariance 17 December 2020 [Justas Dauparas](https://twitter.com/JustasDauparas) & [Fabian Fuchs](https://twitter.com/fabianfuchsml) A few weeks ago, in the latest CASP competition for protein structure prediction ([CASP14](https://pre…
GML Newsletter: Do we do it right?
My new issue of the Graph ML newsletter: looking back and looking ahead for the field. This time I want to raise the point that, with all the great research we have in GML, we have comparatively few applications of it in the real world, and that it's perhaps up to us to pitch and use these developments for the good of people.
Also, I moved the newsletter to Substack, where there is a handy button to support my writing. When I moved, I was really just trying to eliminate the costs of the previous platform, but a few people became paying subscribers right from the start, which pleasantly surprised me. There are some perks to being a paying subscriber (no T-shirts yet :), but my plan is to keep writing for everybody, so hopefully it's a win-win for me and the readers.
Fresh picks from ArXiv
Today at ArXiv: GNNs for recommendations, scene graphs for VQA, and a survey on hyperbolic neural nets 👨🎓
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
* TextGNN: Improving Text Encoder via Graph Neural Network in Sponsored Search WebConf 2021
* ComQA: Compositional Question Answering via Hierarchical Graph Neural Networks WebConf 2021
* Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation WebConf 2021
* Discrete Knowledge Graph Embedding based on Discrete Optimization AAAI-20 Workshop
GNNs
* LaneRCNN: Distributed Representations for Graph-Centric Motion Forecasting with Raquel Urtasun
* Understanding the Role of Scene Graphs in Visual Question Answering
Surveys
* Hyperbolic Deep Neural Networks: A Survey
* Reinforcement learning based recommender systems: A survey
ICLR 2021 stats
Dates: May 4-8
Where: Online
All papers can be found here. Graph papers can be found here.
• 2997 submissions (vs 2594 in 2020)
• 860 accepted (vs 687 in 2020)
• 29% acceptance rate (vs 26.5% in 2020)
• 50 graph papers (6% of total)
Boost then Convolve: Gradient Boosting Meets Graph Neural Networks
In our new work at ICLR 2021, we explore how to apply gradient boosted decision trees (GBDT) to graphs. Surprisingly, I haven't come across papers that test the performance of pure GBDT on graphs, for example for node classification.
GBDTs are usually applied to heterogeneous data (e.g. in Kaggle competitions), where columns can be categorical and of different scale and meaning (e.g. an income column vs. an age column). Such data is quite common in the real world, but most research graph datasets have sparse, homogeneous node features (e.g. bag-of-words features or word embeddings). So we asked whether GNNs are efficient on graphs with heterogeneous features.
The first insight is that you can simply pretrain a GBDT on the node features and use its predictions as input for training the GNN; this alone gives a boost to the GNN model (see the sketch at the end of this post).
Second, we propose a scheme to train the GBDT and the GNN end-to-end, which boosts performance further.
Third, this combination of GBDT and GNN, which we call BGNN, converges much faster than a GNN alone and is therefore usually faster to train than a pure GNN.
Some limitations:
* BGNN works well with heterogeneous features, so Cora and other datasets with homogeneous features are still better off with a plain GNN.
* The approach works for node regression and classification. We have some ideas on how to extend it to link prediction and graph classification but haven't worked them out yet. If you are interested in continuing this line of work, let me know.
The code and datasets are available here.
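To illustrate the first insight, here is a minimal sketch, not the paper's actual code (the real implementation is linked above): pretrain a GBDT on raw node features, then feed its predictions to a GNN. The toy graph, model, and hyperparameters are all invented for brevity.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import GradientBoostingClassifier

# Toy data: 6 nodes with tabular features and binary labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))                 # heterogeneous node features
y = np.array([0, 1, 0, 1, 1, 0])            # node labels
A = torch.eye(6) + torch.ones(6, 6) / 6     # toy normalized adjacency

# Step 1: pretrain a GBDT on the node features alone.
gbdt = GradientBoostingClassifier(n_estimators=50).fit(X, y)
h0 = torch.tensor(gbdt.predict_proba(X), dtype=torch.float32)

# Step 2: use GBDT predictions as the GNN's input features.
class TinyGNN(nn.Module):
    def __init__(self, in_dim, hid, out):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid)
        self.lin2 = nn.Linear(hid, out)

    def forward(self, h, adj):
        h = torch.relu(adj @ self.lin1(h))  # one aggregation step
        return adj @ self.lin2(h)

gnn = TinyGNN(h0.shape[1], 16, 2)
logits = gnn(h0, A)                         # train with cross-entropy as usual
```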
Graph Machine Learning research groups: Stefanie Jegelka
I'm doing a series of posts on research groups in graph ML; the previous post is here. The 22nd is Stefanie Jegelka, a professor at MIT working on submodular functions, DPPs, and more recently on theoretical aspects of GNNs.
Stefanie Jegelka (~1986)
- Affiliation: MIT
- Education: Ph.D. at the Max Planck Institute for Intelligent Systems, Tübingen, and ETH Zurich in 2012 (advisors: Jeff Bilmes, Bernhard Schölkopf, Andreas Krause)
- h-index: 33
- Awards: Joseph A. Martore Award, NSF CAREER Award, best paper awards at ICML and NeurIPS
- Interests: generalization and expressivity of GNNs, clustering and graph partitioning
Graph Machine Learning research groups: Michele Coscia
I'm doing a series of posts on research groups in graph ML; the previous post is here. The 21st is Michele Coscia, the author of The Atlas for the Aspiring Network Scientist.
Michele Coscia (~1985)
- Affiliation: IT…
PhD position in Graph Neural Networks Modelling
The Norwegian University of Science and Technology has opened a PhD position on the thesis topic "Interpretable Models with Graph Neural Networks to Support the Green Transition of Critical Infrastructures". The deadline is 1 Feb 2021; the position is a 3-year contract at ~500K NOK per year before tax.
Fresh picks from ArXiv
This week on ArXiv: scaling GNNs to billions of edges, a connection between label propagation and message passing, and GNNs for DAGs 🗡
Scalability
* Learning Massive Graph Embeddings on a Single Machine
* PyTorch-Direct: Enabling GPU Centric Data Access for Very Large Graph Neural Network Training with Irregular Accesses
Graph models
* Generating a Doppelganger Graph: Resembling but Distinct
* A Generalized Weisfeiler-Lehman Graph Kernel
* A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations with Austin R. Benson
* Hypergraph clustering: from blockmodels to modularity with Austin R. Benson
ICLR 2021
* Boost then Convolve: Gradient Boosting Meets Graph Neural Networks with me
* Directed Acyclic Graph Neural Networks
WWW 2021
* How Do Hyperedges Overlap in Real-World Hypergraphs? -- Patterns, Measures, and Generators
Survey
* A Review of Graph Neural Networks and Their Applications in Power Systems
Course: ODS Knowledge Graphs
Michael Galkin is starting a self-paced course on knowledge graphs. For now it's only in Russian, with plans for an English version after the first iteration. The introductory lecture is available on YouTube, and the first lecture starts this Thursday. For questions, proposals, and announcements, join the discussion group @kg_course.
Course curriculum:
* Knowledge representations (RDF, RDFS, OWL)
* Storage and queries (SPARQL, Graph DBs; a small example after this list)
* Consistency (RDF*, SHACL, ShEx)
* Semantic Data Integration
* Graph theory intro
* KG embeddings
* GNNs for KGs
* Applications: Question Answering, Query Embeddings
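To give a flavor of the storage-and-queries part of the curriculum, here is a tiny self-invented example (not from the course materials) that builds an RDF graph and queries it with SPARQL via the rdflib library:

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, RDF.type, EX.Person))      # triple: alice is a Person
g.add((EX.alice, EX.knows, EX.bob))         # triple: alice knows bob

# SPARQL query: find everyone that alice knows.
query = """
SELECT ?who WHERE { <http://example.org/alice> <http://example.org/knows> ?who }
"""
for row in g.query(query):
    print(row.who)                          # -> http://example.org/bob
```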
GNN User Group events
The first event of the GNN User Group, organized by the DGL team (Amazon) and the cuGraph team (NVIDIA), starts tomorrow; events will be held monthly. The first talk is "A Framework for Differentiable Discovery of Graph Algorithms" by Dr. Le Song (Georgia Tech), followed by a networking session.
RoboGrammar: Graph Grammar for Terrain-Optimized Robot Design
(video) Recent work from MIT on constructing different robot designs via a graph grammar. Graph grammars, introduced in 1992, define a set of rules for transforming one graph into another. With this, a user specifies the available robot components as well as the type of terrain, and the graph grammar produces possible robot designs; a variant of the A* algorithm then searches for the optimal design for the given terrain. More on this in this article.
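As a toy illustration of the idea (this rule is made up and is not from RoboGrammar's actual grammar), a graph-grammar rule can be implemented as a function that matches a nonterminal node and replaces it with a small subgraph:

```python
import networkx as nx

def apply_rule(g, nonterminal="limb"):
    """Replace every 'limb' node with a joint connected to a link."""
    g = g.copy()
    matches = [n for n, d in g.nodes(data=True) if d.get("label") == nonterminal]
    for n in matches:
        joint, link = f"{n}_joint", f"{n}_link"
        g.add_node(joint, label="joint")
        g.add_node(link, label="link")
        g.add_edge(joint, link)
        for nb in list(g.neighbors(n)):     # reattach former neighbors
            g.add_edge(nb, joint)
        g.remove_node(n)
    return g

robot = nx.Graph()
robot.add_node("body", label="body")
robot.add_node("l1", label="limb")
robot.add_edge("body", "l1")
print(apply_rule(robot).edges)              # body--l1_joint, l1_joint--l1_link
```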
https://cdfg.mit.edu/publications/robogrammar-graph-grammar-for-terrain-optimized-robot-design
CS224W: Machine Learning with Graphs 2021
CS224W, taught by Jure Leskovec at Stanford, is one of the most popular graph courses. This year it includes extra topics such as label propagation, scalability of GNNs, and graph networks for science and biology. Slides for the first 6 of the 20 lectures are available.
Video: GNN User Group
The video from the first GNN User Group meeting covers the usage and next release of DGL and features Le Song's talk on combinatorial optimization.
GML Newsletter: Interpolation and Extrapolation of Graph Neural Networks
The new issue of the newsletter is about the generalization of GNNs. Compared to the study of expressive power, there are fewer works on generalization; nonetheless, I gathered the most exciting research I could find on this topic, which I hope will familiarize you with this research direction.
Fresh picks from ArXiv
This week on ArXiv: a TensorFlow GNN library, a survey on graph-based kNN search, and automating peer review? 🧐
Conferences
* Interpreting and Unifying Graph Neural Networks with An Optimization Framework WWW 2021
* A Graph-based Relevance Matching Model for Ad-hoc Retrieval AAAI 2021
Software
* Efficient Graph Deep Learning in TensorFlow with tf_geometric
Survey
* A Comprehensive Survey and Experimental Comparison of Graph-Based Approximate Nearest Neighbor Search
* Graph Neural Network for Traffic Forecasting: A Survey
* Can We Automate Scientific Reviewing?
How many paths of length k exist in a graph?
In case you are preparing for your next interview, here is a nice post describing several solutions to a common interview problem: counting the number of walks of length k between two nodes in a graph. The problem is not as easy as it seems.
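One classical tool for the walk-counting version of the problem (I won't spoil the post's full set of solutions): if A is the adjacency matrix, then entry (i, j) of A^k counts the walks of length k from node i to node j. A quick check on a triangle graph:

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])                   # adjacency matrix of a triangle

k = 3
walks = np.linalg.matrix_power(A, k)
print(walks[0, 1])                          # 3 walks of length 3 from node 0 to 1
```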
Tutorial: Graph Neural Networks: Models and Applications
A new AAAI 2021 tutorial covering robustness, attacks, scalability, and self-supervised learning for GNN models. Slides and video are available.
Sberloga Talk
In case you speak Russian: today I will present our ICLR 2021 work on combining GBDT with GNNs on graphs with tabular features. The talk is at 19:00 MSK; the Zoom link will be shared soon at @sberlogawithgraphs. For more videos from Sberloga, subscribe here: https://www.youtube.com/c/SBERLOGA