17th Workshop on Algorithms and Models for the Web Graph
There is a pretty interesting workshop on graph theory and its applications to the web graph. There are 5 talks each day, from 21 September (today) to 24 September. The workshop will be held online.
Fresh picks from ArXiv
This week on ArXiv: an application of GNNs to COVID forecasting, a new graph-to-sequence algorithm for machine translation, and a scikit library for network analytics ✍️
GNN
- Recurrent Graph Tensor Networks
- Image Retrieval for Structure-from-Motion via Graph Convolutional Network
- United We Stand: Transfer Graph Neural Networks for Pandemic Forecasting
KG
- Inductive Learning on Commonsense Knowledge Graph Completion with Jure Leskovec
- Type-augmented Relation Prediction in Knowledge Graphs
NLP
- Question Directed Graph Attention Network for Numerical Reasoning over Text EMNLP 20
- Graph-to-Sequence Neural Machine Translation
Software
- Scikit-network: Graph Analysis in Python
PhD Theses on Graph Machine Learning
Here are some PhD dissertations on GML. Part 3 (previous here).
Xiaowen Dong: Multi-view signal processing and learning on graphs (EPFL 2014)
Yan Leng: Collective behavior over social networks with data-driven and machine learning models (MIT 2020)
Davide Boscaini: Geometric Deep Learning for Shape Analysis (Università della Svizzera Italiana 2017)
3DGV Seminar: Michael Bronstein
There is a good ongoing seminar series on 3D geometry and vision. The latest seminar, "Geometric Deep Learning for 3D Shape Analysis and Synthesis", was presented by Michael Bronstein, who talked about inductive biases, a timeline of GNN architectures, and several successful applications. Quite insightful.
Message Passing for Hyper-Relational Knowledge Graphs
This is a guest post by Michael Galkin about their recently accepted paper at EMNLP.
Traditionally, knowledge graphs (KGs) use triples to encode their facts, e.g.

subject, predicate, object

Simple and straightforward, triple-based KGs are extensively used in a plethora of NLP and CV tasks. But can triples effectively encode richer facts when we need them? If we have the two facts:

Albert Einstein, educated at, ETH Zurich
Albert Einstein, educated at, University of Zurich

what can we say about Einstein's education? Did he attend two universities at the same time? 🤨

It is a common problem of triple-based KGs when we want to assign more attributes to each typed edge. Luckily, the KG community has two good ways to do that: RDF* and Labeled Property Graphs (LPGs). With RDF* we could instantiate each fact with qualifiers:

( Albert_Einstein educated_at ETH_Zurich )
    academic_degree Bachelor ;
    academic_major Maths .
( Albert_Einstein educated_at University_of_Zurich )
    academic_degree Doctorate ;
    academic_major Physics .

We call such KGs hyper-relational KGs. Wikidata follows the same model; here is Einstein's page, where you'd find statements (hyper-relational facts) with qualifiers (those additional key-value edge attributes).
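To make the data model concrete, here is a minimal sketch in plain Python of how a hyper-relational fact can be stored: a main triple plus an arbitrary list of (relation, entity) qualifier pairs. This is my own illustration, not the paper's code.

# A minimal, illustrative encoding of hyper-relational facts
# (not the paper's actual code): a main (subject, predicate, object)
# triple plus any number of (qualifier_relation, qualifier_entity) pairs.
from dataclasses import dataclass, field

@dataclass
class HyperRelationalFact:
    subject: str
    predicate: str
    object: str
    qualifiers: list[tuple[str, str]] = field(default_factory=list)

facts = [
    HyperRelationalFact(
        "Albert_Einstein", "educated_at", "ETH_Zurich",
        qualifiers=[("academic_degree", "Bachelor"),
                    ("academic_major", "Maths")],
    ),
    HyperRelationalFact(
        "Albert_Einstein", "educated_at", "University_of_Zurich",
        qualifiers=[("academic_degree", "Doctorate"),
                    ("academic_major", "Physics")],
    ),
]

# The two facts stay distinguishable even though they share
# the same subject and predicate.
for f in facts:
    print(f.object, dict(f.qualifiers))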
Interestingly, there is pretty much nothing 🕳 in the Graph ML field for hyper-relational graphs. We have a bunch of GNN encoders for directed, multi-relational, triple-based KGs (like R-GCN or CompGCN), but nothing for hyper-relational ones.

In our new paper, we design StarE ⭐️, a GNN encoder for hyper-relational KGs (like RDF* or LPG) where each edge might have an unlimited number of (relation, entity) qualifier pairs. Moreover, those entities and relations do not need to be qualifier-specific: they can be used in the main triples as well!

In addition, we carefully constructed WD50K, a new Wikidata-based dataset for link prediction on hyper-relational KGs, along with its 3 descendants for various setups. Experiments show that qualifiers greatly improve subject/object prediction accuracy, sometimes reaching a whopping gap of 25 MRR points. More applications and tasks are to appear in future work!
Paper: https://arxiv.org/abs/2009.10847
Blog: Medium friends link
Code: Github
Graph Machine Learning research groups: Alejandro Ribeiro
I do a series of posts on the groups in graph research; the previous post is here. The 15th is Alejandro Ribeiro, head of Alelab at UPenn and the lead author of the ongoing GNN course.
Alejandro Ribeiro (1975)
- Affiliation: University of Pennsylvania
- Education: Ph.D. at the University of Minnesota in 2006 (advisor: Georgios B. Giannakis)
- h-index 51
- Awards: Hugo Schuck best paper award, paper awards at CDC, ACC, ICASSP, Lindback award, NSF award
- Interests: wireless autonomous networks, machine learning on network data, distributed collaborative learning
NeurIPS 2020 stats
Dates: Dec 6 - 12
Where: Online
Price: $25/$100 (students/non-students)
• 9454 submissions (vs 6743 in 2019)
• 1900 accepted (vs 1428 in 2019)
• 20.1% acceptance rate (vs 21% in 2019)
• 123 graph papers (6.5% of total)
Fresh picks from ArXiv
Many papers caught my attention this week (and it's not just because of NeurIPS), very interesting stuff: debunking the value of scene graphs, extrapolation of GNNs, GraphNorm, Alibaba's KG construction, closed-form formulas for graphlets, and applications to river dynamics 🌊
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
- Are scene graphs good enough to improve Image Captioning? AACL 2020
- Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph EMNLP 2020
- Structure Aware Negative Sampling in Knowledge Graphs EMNLP 2020 with William L. Hamilton
- Message Passing for Hyper-Relational Knowledge Graphs EMNLP 2020 with Michael Galkin
- Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning ICDM 2020
- Graph neural induction of value iteration GRL+ 2020
- Heterogeneous Molecular Graph Neural Networks for Predicting Molecule Properties ICDM 2020
GNN
- How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks with Stefanie Jegelka
- Learning Graph Normalization for Graph Neural Networks
Applications
- Physics-Guided Recurrent Graph Networks for Predicting Flow and Temperature in River Networks
- SIA-GCN: A Spatial Information Aware Graph Neural Network with 2D Convolutions for Hand Pose Estimation
Industry
- AliMe KG: Domain Knowledge Graph Construction and Application in E-commerce
Math
- Counting five-node subgraphs
Survey
- A survey of graph burning
SE(3)-Transformers
A blog post about a recent paper (NeurIPS 2020) that introduces group theory to set functions. It seems to perform on par with state-of-the-art methods for classification and regression, but at least it is provably equivariant.
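As a toy illustration of what "provably equivariant" means here (my own example, not from the paper): a rotation-equivariant function f satisfies f(Rx) = Rf(x), which you can check numerically for a simple centroid-offset map on a point cloud.

# Toy check of rotation equivariance: f(R x) == R f(x).
# My own illustrative example; unrelated to the SE(3)-Transformer code.
import numpy as np

def centroid_offsets(points):
    # Map each 3D point to its offset from the cloud's centroid;
    # rotation-equivariant (and translation-invariant) by construction.
    return points - points.mean(axis=0)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))                      # a tiny 3D point cloud

rot, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
if np.linalg.det(rot) < 0:                       # ensure a proper rotation
    rot[:, 0] *= -1

lhs = centroid_offsets(x @ rot.T)                # rotate inputs, then apply f
rhs = centroid_offsets(x) @ rot.T                # apply f, then rotate outputs
assert np.allclose(lhs, rhs)                     # equivariance holds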
NYU Deep Learning Course: Structured Prediction
The final lecture of the deep learning course led by Yann LeCun, "Week 14 – Lecture: Structured prediction with energy-based models", covers structured prediction, energy-based factor graphs, and graph transformer networks.
The next big thing: the use of graph neural networks to discover particles
It's great to see that GNNs can be useful for fundamental applications such as the discovery of new particles. In another post by Fermilab, a US-based physics lab, researchers discuss moving GNNs to production for the Large Hadron Collider (LHC) at CERN. The goal is to process millions of images and select those that could be relevant to the discovery of new particles. They expect to see the results in the LHC's Run 3 in 2021. The ArXiv preprint is available online.
ICLR 2021 Graph Papers
Last Friday, submissions to ICLR 2021 became available for reading. There are 3013 submissions, of which about 210 are graph papers (7% of the total). Judging by the overlap of paper submissions, about every third paper is a resubmission of a NeurIPS rejection, which surprised me not just in sheer volume, but also because I'm puzzled about where the remaining 6000 rejected papers get resubmitted.
I extracted the graph papers, which are attached, and categorized them loosely into 4 topics: model, theory, application, and survey. Most of the papers (171) are about new models (general GNNs, graph models for new problems, improvements over existing models). 22 papers are novel applications in physics, chemistry, biology, etc., 13 are theoretical papers, and 4 are surveys/evaluation benchmarks.
Fresh picks from ArXiv
Today on ArXiv: GNNs rescue NLP, the power of random initialization, and a survey on the computation of GNNs 🏭
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
- Towards Interpretable Reasoning over Paragraph Effects in Situation EMNLP 2020
- Double Graph Based Reasoning for Document-level Relation Extraction EMNLP 2020
- Neural Topic Modeling by Incorporating Document Relationship Graph EMNLP 2020
- GraphDialog: Integrating Graph Knowledge into End-to-End Task-Oriented Dialogue Systems EMNLP 2020
- Knowledge-Enhanced Personalized Review Generation with Capsule Graph Neural Network CIKM 2020
- Knowledge Graph Embeddings in Geometric Algebras COLING 2020
- TeRo: A Time-aware Knowledge Graph Embedding via Temporal Rotation COLING 2020
GNNs
- The Surprising Power of Graph Neural Networks with Random Node Initialization
- Interpreting Graph Neural Networks for NLP With Differentiable Edge Masking with Ivan Titov
- Direct Multi-hop Attention based Graph Neural Network with Jure Leskovec
- Graph Neural Networks with Heterophily with Danai Koutra
- My Body is a Cage: the Role of Morphology in Graph-Based Incompatible Control with Shimon Whiteson
Survey
- Computing Graph Neural Networks: A Survey from Algorithms to Accelerators
RAPIDS cuGraph adds NetworkX and DiGraph Compatibility
A very exciting update for running graph algorithms on GPUs: huge speedups for typical algorithms (PageRank, SCC, etc.) and new algorithms (Louvain, Leiden, etc.) for graphs with thousands of vertices. cuGraph now also accepts NetworkX Graph and DiGraph objects as valid inputs, so the migration from NetworkX seems very smooth and worth giving a shot.
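To show how smooth the migration is, here is a minimal sketch of the drop-in usage described in the post: build a graph with NetworkX as usual and pass it straight to a cuGraph algorithm. It assumes a CUDA-capable GPU with RAPIDS cuGraph installed; exact return types (dict vs. cuDF DataFrame) may vary by version.

# A minimal sketch of the NetworkX -> cuGraph migration (assumes a
# CUDA-capable GPU with RAPIDS cuGraph installed).
import networkx as nx
import cugraph

G = nx.karate_club_graph()       # build an ordinary NetworkX graph, as before

pr_cpu = nx.pagerank(G)          # CPU baseline: NetworkX's own PageRank

# With the compatibility layer, cuGraph accepts the NetworkX Graph
# object directly; no manual conversion to cudf edge lists needed.
pr_gpu = cugraph.pagerank(G)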
DataStart Conference 2020
There is a Russian-language event, DataStart (20 Oct), that includes presentations from leading experts in industry and academia in Russia. The speakers include Anton Tsitsulin, who will talk about unsupervised graph embeddings, and Valentin Malykh, who will describe how you can use knowledge graphs for visualization in NLP.
GML Newsletter Issue #3
The third issue of the GML newsletter is available! Blog posts, videos, past and future events.
Graph Machine Learning research groups: Tina Eliassi-Rad
I do a series of posts on the groups in graph research; the previous post is here. The 16th is Tina Eliassi-Rad, a coauthor of the Cora datasets that are still widely used in node classification benchmarks.
Tina Eliassi-Rad (~1974)
- Affiliation: Northeastern University
- Education: Ph.D. at University of Wisconsin-Madison in 2001 (advisor: Jude Shavlik)
- h-index 32
- Awards: best paper awards ICDM, CIKM; ISI fellow
- Interests: graph mining, anomaly detection, graph algorithms
NeurIPS 2020 Graph Papers
I counted 123 graph papers (attached) at NeurIPS 2020, which is 6.5% of all accepted papers. This repo provides a good categorization of graph papers into topics such as oversmoothing, adversarial attacks, expressive power, etc.
Also, the plot shows the number of accepted papers per "graph" author, i.e., authors with at least one graph paper at NeurIPS 2020.