On the evaluation of graph neural networks
Over the last year there have been many revealing benchmark papers that re-evaluate existing GNNs on standard tasks such as node classification (see this and this, for example). However, the gap between claimed and real results still exists and is especially noticeable when the baselines are not properly selected.
For one, using an MLP on node features alone often leads to better results than GNNs. This is surprising, as GNNs can be seen as a generalization of MLPs. I encounter this more and more on new data sets, although for several data sets (e.g. Cora) you can clearly see the advantage of using GNNs.
Another ML model that I haven't seen tried in graph settings is GBDT (e.g. XGBoost, CatBoost, LightGBM). GBDTs are the de-facto winners of many Kaggle competitions where the data is tabular, so you could expect that, if you have enough variability in your node features, just using a GBDT on them would often make a good baseline. I have tried this for several problems, and it often outperforms the method proposed in the paper. For example, for node classification, a GBDT on the Bus data set achieves 100% accuracy (vs. ~80% in the paper). And on graph classification, GBDT can beat other top GNN models (see image below). Considering how easy it is to run experiments with GBDT models, I would expect it to be a good counterpart to MLP in the realm of baselines.
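A minimal sketch of what such graph-agnostic baselines look like (the data here is a random stand-in for a real dataset, and all names and hyperparameters are placeholders, not the setup from any particular paper):

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Toy stand-in for a node-classification dataset: X holds node features,
# y holds node labels; the graph itself is deliberately ignored.
rng = np.random.default_rng(0)
X = rng.normal(size=(2708, 1433))      # Cora-sized feature matrix
y = rng.integers(0, 7, size=2708)      # 7 classes
train_mask = rng.random(2708) < 0.1    # small labeled set, as in Cora-style splits
test_mask = ~train_mask

# GBDT baseline on node features only.
gbdt = LGBMClassifier(n_estimators=500, learning_rate=0.05)
gbdt.fit(X[train_mask], y[train_mask])
print("GBDT acc:", accuracy_score(y[test_mask], gbdt.predict(X[test_mask])))

# MLP baseline, equally graph-agnostic.
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300)
mlp.fit(X[train_mask], y[train_mask])
print("MLP acc:", accuracy_score(y[test_mask], mlp.predict(X[test_mask])))
```

Swapping LightGBM for XGBoost or CatBoost is a one-line change, which is exactly why these baselines are so cheap to run.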
Graph Machine Learning research groups: Danai Koutra
I do a series of posts on the groups in graph research; the previous post is here. The 14th is Danai Koutra, an ex-PhD student of Christos Faloutsos. She leads the graph exploration lab at the University of Michigan and could be a great Ph.D. advisor if you are interested in GML.
Danai Koutra (~1988)
- Affiliation: University of Michigan
- Education: Ph.D. from Carnegie Mellon University in 2015 (advisor: Christos Faloutsos)
- h-index: 25
- Awards: ACM SIGKDD 2016 Dissertation Award; best paper awards at ICDM, PAKDD, ICDT
- Interests: graph mining, knowledge graphs, graph embeddings
Latent graph neural networks: Manifold learning 2.0?
One of the hot topics of this year is the construction of a graph from unstructured data (e.g. 3D points or images). In a new post, Michael Bronstein discusses existing approaches to latent graph learning and suggests that using a GNN both to learn the structure of the graph and to solve the downstream task can be a better alternative to a decoupled approach. This is indeed an exciting and active area of research, with open problems and known applications to NLP, physics, and biology.
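As a toy illustration of the coupled idea (this is a generic DGCNN-style sketch, not the method from the post): the layer rebuilds a k-NN graph from the current node features on every forward pass, so the latent graph and the downstream task are learned together.

```python
import torch
import torch.nn as nn

class DynamicEdgeConv(nn.Module):
    """One latent-graph layer: build a kNN graph in feature space, then aggregate."""
    def __init__(self, in_dim, out_dim, k=8):
        super().__init__()
        self.k = k
        self.mlp = nn.Sequential(nn.Linear(2 * in_dim, out_dim), nn.ReLU())

    def forward(self, x):                      # x: (num_nodes, in_dim)
        dist = torch.cdist(x, x)               # pairwise distances
        idx = dist.topk(self.k + 1, largest=False).indices[:, 1:]  # kNN, drop self
        neighbors = x[idx]                     # (num_nodes, k, in_dim)
        center = x.unsqueeze(1).expand_as(neighbors)
        edge_feat = torch.cat([center, neighbors - center], dim=-1)
        return self.mlp(edge_feat).max(dim=1).values  # max over neighbors

x = torch.randn(100, 3)                        # e.g. an unstructured 3D point cloud
h = DynamicEdgeConv(3, 32)(x)                  # latent graph built on the fly
```

Because the graph is recomputed from learned features, the structure itself adapts to the loss, which is the key difference from a decoupled pipeline that fixes the graph up front.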
Fresh picks from ArXiv
This week on ArXiv: robot planning in the presence of many objects by researchers at MIT, a new SOTA on probabilistic type inference for software, and an application of GNNs to fitting clothes to different body shapes.
Applications
- Planning with Learned Object Importance in Large Problem Instances using Graph Neural Networks with Joshua Tenenbaum
- Advanced Graph-Based Deep Learning for Probabilistic Type Inference
- GINet: Graph Interaction Network for Scene Parsing ECCV 20
- Fully Convolutional Graph Neural Networks for Parametric Virtual Try-On SIGGRAPH 2020
- Addressing Cold Start in Recommender Systems with Hierarchical Graph Neural Networks
Theory
- Learning an Interpretable Graph Structure in Multi-Task Learning
- Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization
- Adversarial Attack on Large Scale Graph
Graph ML at Data Fest 2020
This year, together with @IggiSv9t, I am organizing a track at Data Fest 2020. It's like a workshop at a conference, but more informal. We will have videos from our amazing speakers and also networking, where you can speak to me, @IggiSv9t, the speakers, or other people who are interested in graph machine learning. Besides our track, there will be many other interesting tracks on all aspects of ML and DS (interpretability, antifraud, ML in healthcare, and 40 more tracks!).
It will be this weekend, 19-20 September. You need to register (for free) at https://fest.ai/2020/.
Our videos:
Day 1 (Saturday)
1. Opening remarks: Graph Machine Learning, Sergey Ivanov, Criteo, France
2. Graph-Based Nearest Neighbor Search: Practice and Theory, Liudmila Prokhorenkova, Yandex, Russia
3. Graphical Models for Tensor Networks and Machine Learning, Roman Schutski, Skoltech, Russia
4. Unsupervised Graph Representations, Anton Tsitsulin, University of Bonn & Google, Germany
5. Placing Knowledge Graphs in Graph ML, Michael Galkin, TU Dresden, Germany
Day 2 (Sunday)
1. Large Graph Visualization Tools and Approaches, Sviatoslav Kovalev, Samokat, Russia
2. Business Transformation as Graph Problems, Vadim Safronov, Key Points, Portugal
3. Scene Graph Generation from Images, Boris Knyazev, University of Guelph & Vector Institute, Canada
4. AutoGraph: Graphs Meet AutoML, Denis Vorotinsev, Oura, Finland
5. Link Prediction with Graph Neural Networks, Maxim Panov, Skoltech, Russia
See you there!
On the Cora dataset
Cora, Citeseer, and Pubmed are three popular data sets for node classification. They are among those cases where you can clearly see the power of GNNs. For example, on Cora GNNs reach around 80% accuracy, while GBDT/MLP reach only around 60%. This is not often the case: for many data sets I see only a marginal win for GNNs compared to non-graph methods, and for some data sets GNNs are actually worse.
So why is the performance of GNNs so great on this data set? I don't have a good answer, but here are some thoughts. Cora is a citation network, where nodes are papers and classes are the papers' fields. However, it's not clear what the links between these documents are: the original paper doesn't describe how exactly the links were established. If the links were based on citation, i.e. two papers are connected if one cites the other, that could explain such a big improvement for GNNs: a GNN sees all nodes during training, while an MLP sees only the training nodes, and since two linked papers are likely to share the same field, the GNN can leverage this graph information. If that's the case, a simple k-NN majority-vote baseline should perform similarly to a GNN. However, people who know the authors of the original paper say that the links were established based on word similarity between documents. If that's true, I'm not sure why GNNs do so well on this data set. In any case, constructing graphs from real-world data requires a lot of attention and transparency, which is why structure learning is such an active topic.
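For reference, here is a minimal sketch of such a neighbor majority-vote baseline (on a toy graph with made-up labels; a real test would use Cora's actual splits):

```python
from collections import Counter
import networkx as nx

G = nx.karate_club_graph()                          # toy stand-in for Cora
train_labels = {0: "A", 33: "B", 5: "A", 30: "B"}   # labels known on a few nodes

def majority_vote(G, node, train_labels):
    # Predict the most common label among a node's labeled 1-hop neighbors.
    votes = [train_labels[n] for n in G.neighbors(node) if n in train_labels]
    return Counter(votes).most_common(1)[0][0] if votes else None

for node in [1, 2, 32]:
    print(node, "->", majority_vote(G, node, train_labels))
```

If links encode "same field", this trivial baseline should already close most of the GNN-MLP gap; if it doesn't, the graph is carrying something subtler than label homophily.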
Graph ML at Data Fest 2020
Day 1 was a pleasant surprise: people with different backgrounds came, watched videos, and asked questions. Here are the 5 videos of day 1:
1. Opening remarks: Graph Machine Learning, Sergey Ivanov, Criteo, France (where I broadly talk about what GML is, the best resources, the community, etc.);
2. Graph-Based Nearest Neighbor Search: Practice and Theory, Liudmila Prokhorenkova, Yandex, Russia (where she speaks about k-NN search on graphs, HNSW, theory, and her ICML 2020 work);
3. Graphical Models for Tensor Networks and Machine Learning, Roman Schutski, Skoltech, Russia (where he speaks about graphical models, treewidth, and tensor decomposition);
4. Unsupervised Graph Representations, Anton Tsitsulin, University of Bonn & Google, Germany (where he speaks about all the popular node embedding methods and their pros and cons);
5. Placing Knowledge Graphs in Graph ML, Michael Galkin, TU Dresden, Germany (all you need to know about knowledge graphs if you don't know what they are).
On day 2, tomorrow, we will have 5 more videos, which will be about applications of graphs.
Please join us tomorrow at https://spatial.chat/s/ods at 12pm (Moscow time).
Graph ML at Data Fest 2020
Day 2 continued to surprise me, as many people joined on Sunday to listen to our talks. It was especially interesting to see English-speaking participants who were not shy about asking questions among so many Russian speakers. I see this as a promising step towards making the ODS community truly global.
Here is the second portion of videos, more related to applications of graphs.
1. Large Graph Visualization Tools and Approaches, Sviatoslav Kovalev, Samokat, Russia
2. Business Transformation as Graph Problems, Vadim Safronov, Key Points, Portugal
3. AutoGraph: Graphs Meet AutoML, Denis Vorotinsev, Oura, Finland
4. Scene Graph Generation from Images, Boris Knyazev, University of Guelph & Vector Institute, Canada
5. Link Prediction with Graph Neural Networks, Maxim Panov, Skoltech, Russia
My gratitude to all the speakers!
Until next time!
GNN course at UPenn
In addition to cs224w at Stanford and COMP 766 at McGill (both should happen next semester), there is a good-looking, currently ongoing course on Graph Neural Networks at the University of Pennsylvania by Alejandro Ribeiro, who has worked on graph ML and graph signal processing. The course is in its third week, and there are already videos and assignments on graph convolutional filters, empirical risk minimization, and an introduction to the field.
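For a flavor of the first topic: in the graph signal processing view, a graph convolutional filter is a polynomial of a graph shift operator S (e.g. the adjacency matrix) applied to a graph signal x, y = sum_k h_k S^k x. A tiny sketch (toy graph and made-up coefficients, just to show the recursion):

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply the polynomial filter sum_k h[k] * S^k to the graph signal x."""
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)          # S^0 @ x
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx              # next power: S^{k+1} @ x
    return y

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-node path graph
x = np.array([1.0, 0.0, 0.0])                                  # signal on node 0
print(graph_filter(A, x, h=[0.5, 0.3, 0.2]))                   # mixes 0-, 1-, 2-hop info
```

Each extra coefficient h_k mixes in information from one more hop, which is the bridge from classical filters to GNN layers that the course builds on.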
17th Workshop on Algorithms and Models for the Web Graph
There is a pretty interesting workshop on graph theory and its applications to the web graph. There are 5 talks each day, from 21 (today) to 24 September. The workshop will be held online.
Fresh picks from ArXiv
This week on ArXiv: an application of GNNs to COVID forecasting, a new graph-to-sequence algorithm for machine translation, and a scikit library for network analytics.
GNN
- Recurrent Graph Tensor Networks
- Image Retrieval for Structure-from-Motion via Graph Convolutional Network
- United We Stand: Transfer Graph Neural Networks for Pandemic Forecasting
KG
- Inductive Learning on Commonsense Knowledge Graph Completion with Jure Leskovec
- Type-augmented Relation Prediction in Knowledge Graphs
NLP
- Question Directed Graph Attention Network for Numerical Reasoning over Text EMNLP 20
- Graph-to-Sequence Neural Machine Translation
Software
- Scikit-network: Graph Analysis in Python
PhD Theses on Graph Machine Learning
Here are some PhD dissertations on GML. Part 3 (previous here).
Xiaowen Dong: Multi-view signal processing and learning on graphs (EPFL 2014)
Yan Leng: Collective behavior over social networks with data-driven and machine learning models (MIT 2020)
Davide Boscaini: Geometric Deep Learning for Shape Analysis (Università della Svizzera italiana 2017)
3DGV Seminar: Michael Bronstein
There is a good ongoing seminar series on 3D geometry and vision. The last seminar was presented by Michael Bronstein, who talked about inductive biases, the timeline of GNN architectures, and several successful applications. Quite insightful.
Message Passing for Hyper-Relational Knowledge Graphs
This is a guest post by Michael Galkin about their recently accepted paper at EMNLP.
Traditionally, knowledge graphs (KGs) use triples to encode their facts, e.g.

    subject, predicate, object

Simple and straightforward, triple-based KGs are extensively used in a plethora of NLP and CV tasks. But can triples effectively encode richer facts when we need them? If we have the two facts:

    Albert Einstein, educated at, ETH Zurich
    Albert Einstein, educated at, University of Zurich

what can we say about Einstein's education? Did he attend two universities at the same time?
It is a common problem of triple-based KGs when we want to assign more attributes to each typed edge. Luckily, the KG community has two good ways to do that: RDF* and Labeled Property Graphs (LPGs). With RDF* we could instantiate each fact with qualifiers:

    ( Albert_Einstein educated_at ETH_Zurich )
        academic_degree Bachelor ;
        academic_major Maths .
    ( Albert_Einstein educated_at University_of_Zurich )
        academic_degree Doctorate ;
        academic_major Physics .

We call such KGs hyper-relational KGs. Wikidata follows the same model: on Einstein's page you'll find statements (hyper-relational facts) with qualifiers (those additional key-value edge attributes).
Interestingly, there is pretty much nothing in the Graph ML field for hyper-relational graphs. We have a bunch of GNN encoders for directed, multi-relational, triple-based KGs (like R-GCN or CompGCN), but nothing for hyper-relational ones.
In our new paper, we design StarE, a GNN encoder for hyper-relational KGs (like RDF* or LPG) where each edge may have an unlimited number of qualifier pairs (relation, entity). Moreover, those entities and relations do not need to be qualifier-specific; they can be used in the main triples as well!
In addition, we carefully constructed WD50K, a new Wikidata-based dataset for link prediction on hyper-relational KGs, and its 3 descendants for various setups. Experiments show that qualifiers greatly improve subject/object prediction accuracy, sometimes reaching a whopping 25 MRR points gap. More applications and tasks are to appear in future work!
Paper: https://arxiv.org/abs/2009.10847
Blog: Medium friends link
Code: Github
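As a toy illustration (not the paper's actual code), a hyper-relational fact is just a main triple plus an arbitrary list of qualifier pairs; the class name below is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class HyperRelationalFact:
    # Main triple, plus any number of (qualifier relation, entity) pairs.
    subject: str
    predicate: str
    obj: str
    qualifiers: list[tuple[str, str]] = field(default_factory=list)

fact = HyperRelationalFact(
    "Albert_Einstein", "educated_at", "ETH_Zurich",
    qualifiers=[("academic_degree", "Bachelor"), ("academic_major", "Maths")],
)
print(fact)
```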
Graph Machine Learning research groups: Alejandro Ribeiro
I do a series of posts on the groups in graph research; the previous post is here. The 15th is Alejandro Ribeiro, head of Alelab at UPenn and the lead instructor of the ongoing GNN course.
Alejandro Ribeiro (1975)
- Affiliation: University of Pennsylvania
- Education: Ph.D. from the University of Minnesota in 2006 (advisor: Georgios B. Giannakis)
- h-index: 51
- Awards: Hugo Schuck best paper award; paper awards at CDC, ACC, ICASSP; Lindback award; NSF award
- Interests: wireless autonomous networks, machine learning on network data, distributed collaborative learning
NeurIPS 2020 stats
Dates: Dec 6 - 12
Where: Online
Price: $25/$100 (students/non-students)
- 9454 submissions (vs 6743 in 2019)
- 1900 accepted (vs 1428 in 2019)
- 20.1% acceptance rate (vs 21% in 2019)
- 123 graph papers (6.5% of total)
Fresh picks from ArXiv
Many papers caught my attention this week (and it's not because of NeurIPS). Very interesting stuff: debunking the value of scene graphs, extrapolation of GNNs, GraphNorm, Alibaba's KG construction, closed formulas for graphlets, and applications to river dynamics.
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
- Are scene graphs good enough to improve Image Captioning? AACL 2020
- Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph EMNLP 2020
- Structure Aware Negative Sampling in Knowledge Graphs EMNLP 2020 with William L. Hamilton
- Message Passing for Hyper-Relational Knowledge Graphs EMNLP 2020 with Michael Galkin
- Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning ICDM 2020
- Graph neural induction of value iteration GRL+ 2020
- Heterogeneous Molecular Graph Neural Networks for Predicting Molecule Properties ICDM 2020
GNN
- How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks with Stefanie Jegelka
- Learning Graph Normalization for Graph Neural Networks
Applications
- Physics-Guided Recurrent Graph Networks for Predicting Flow and Temperature in River Networks
- SIA-GCN: A Spatial Information Aware Graph Neural Network with 2D Convolutions for Hand Pose Estimation
Industry
- AliMe KG: Domain Knowledge Graph Construction and Application in E-commerce
Math
- Counting five-node subgraphs
Survey
- A survey of graph burning