Fresh picks from ArXiv
Today at ArXiv: learning logic and simulations with GNNs, and a new practical guide to GNNs.
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
- Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation (EMNLP 2020)
- Learning to Represent Image and Text with Denotation Graph (EMNLP 2020)
- TeMP: Temporal Message Passing for Temporal Knowledge Graph Completion (EMNLP 2020), with William L. Hamilton
- Embedding Words in Non-Vector Space with Unsupervised Graph Learning (EMNLP 2020)
- Unsupervised Joint k-node Graph Representations with Compositional Energy-Based Models (NeurIPS 2020), with Bruno Ribeiro
- Dirichlet Graph Variational Autoencoder (NeurIPS 2020)
- RatE: Relation-Adaptive Translating Embedding for Knowledge Graph Completion (COLING 2020)
GNN
- Graph Convolutional Value Decomposition in Multi-Agent Reinforcement Learning
- High-Order Relation Construction and Mining for Graph Matching
- RNNLogic: Learning Logic Rules for Reasoning on Knowledge Graphs, with Yoshua Bengio and Jian Tang
- Learning Mesh-Based Simulation with Graph Networks, with Peter W. Battaglia
- Directional Graph Networks, with William L. Hamilton
- Simplicial Neural Networks
Survey
- A Practical Guide to Graph Neural Networks
arXiv.org
Simplicial Neural Networks
We present simplicial neural networks (SNNs), a generalization of graph neural networks to data that live on a class of topological spaces called simplicial complexes. These are natural...
How random are peer reviews?
A new paper came out on the quality of reviews at peer-reviewed conferences; it analyzes submissions to ICLR on OpenReview over the last 4 years. Here is what I found most interesting.
* If an accepted paper were reviewed anew, would it be accepted a second time?
This is called reproducibility of reviews. In 2020 it was 66%, which means that 1 out of 3 times you'd get a reject even if your paper deserves acceptance. What's more, even if you increase the number of reviewers, reproducibility stays around the same ~70%.
* Does the final paper score correlate with how many citations the paper gets?
Yes, higher-ranked papers get more citations. What's more interesting is how many extra citations a paper gets purely from exposure: the correlation with citations doubles just because the paper appears at the venue.
* Is there a bias of affiliation, author reputation, or ArXiv in reviewers' scores?
Yes, but it is very small. For example, papers from Cornell get a 0.58 boost in score (out of 10). For Google and DeepMind, there is no correlation between scores and acceptance rates compared to other papers. The same can be said about the arXiv availability of a paper or the h-index of its authors.
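As an aside, the intuition behind review reproducibility topping out well below 100% is easy to simulate. The sketch below is my own toy model, not code from the paper: two independent committees score the same papers as "true quality plus reviewer noise" and each accepts its top 20%; with the noise level assumed here, agreement between the committees lands well below 1 even though both see the same papers.

```python
import random

def committee_accepts(qualities, noise, accept_frac, rng):
    # A committee sees true quality plus i.i.d. Gaussian reviewer noise
    # and accepts the top accept_frac slice of its own scores.
    scores = [q + rng.gauss(0, noise) for q in qualities]
    cutoff = sorted(scores, reverse=True)[int(len(scores) * accept_frac) - 1]
    return [s >= cutoff for s in scores]

def review_reproducibility(n_papers=2000, noise=1.0, accept_frac=0.2, seed=0):
    # Fraction of papers accepted by committee A that committee B also accepts.
    rng = random.Random(seed)
    qualities = [rng.gauss(0, 1) for _ in range(n_papers)]
    a = committee_accepts(qualities, noise, accept_frac, rng)
    b = committee_accepts(qualities, noise, accept_frac, rng)
    accepted_by_a = [i for i, x in enumerate(a) if x]
    return sum(b[i] for i in accepted_by_a) / len(accepted_by_a)
```

The noise-to-quality ratio here is a made-up placeholder; the point is only that any non-trivial reviewer noise caps agreement between two honest committees.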
OpenReview
An Open Review of OpenReview: A Critical Analysis of the Machine...
Mainstream machine learning conferences have seen a dramatic increase in the number of participants, along with a growing range of perspectives, in recent years. Members of the machine learning...
NeurIPS 2020. Comprehensive analysis of authors, organizations, and countries.
This post analyzes what authors and organizations publish at NeurIPS 2020 this December, similar to the analysis I did for ICML 2020.
In addition to general insights (which I also found interesting), there are two collaboration graphs that I created: one between affiliations and one between authors. What's exciting is that these two networks are very different from each other: the graph of authors is actually quite disconnected, with lots of small groups of people (of size ~50) and a large diameter (25 hops). It could be interesting in the future to understand why that is and who these small groups are.
Medium
NeurIPS 2020. Comprehensive analysis of authors, organizations, and countries.
What will happen at NeurIPS2020 this December? Top authors, affiliations, and countries at the biggest AI Conference of this year analyzed.
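For readers curious how properties like "disconnected, with small components and large diameter" are measured: both reduce to plain BFS. This is a generic sketch of mine (not the code behind the post) for a small undirected collaboration graph given as an adjacency dict:

```python
from collections import deque

def components_and_diameter(adj):
    """Connected components, plus the exact diameter of the largest component,
    via BFS on an undirected graph {node: set_of_neighbors}."""
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        seen.add(s)
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u not in seen:
                    seen.add(u)
                    comp.add(u)
                    q.append(u)
        comps.append(comp)

    def eccentricity(s, comp):
        # Longest shortest path from s within its component.
        dist, q = {s: 0}, deque([s])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u in comp and u not in dist:
                    dist[u] = dist[v] + 1
                    q.append(u)
        return max(dist.values())

    largest = max(comps, key=len)
    # Exact diameter is O(V * E) per component; fine for small graphs.
    diameter = max(eccentricity(v, largest) for v in largest)
    return comps, diameter
```

For the real NeurIPS-scale graph you would reach for a library (e.g. networkx offers equivalent routines), but the logic is the same.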
GML YouTube Videos
I was pleasantly surprised to see that there is a YouTube playlist by Zak Jost that covers some aspects of GNNs, including an interview with DeepMind authors on using GNNs for physics.
YouTube
Graph Neural Networks
A series on the basics of graph neural networks.
Fresh picks from ArXiv
Today at ArXiv: generalization of GNNs, faster graphlet kernels, and sunshine graphs ☀️
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNN
- Discriminability of Single-Layer Graph Neural Networks, with Alejandro Ribeiro
- On Size Generalization in Graph Neural Networks, with Haggai Maron
- Bi-GCN: Binary Graph Convolutional Network
- Fast Graph Kernel with Optical Random Features
- Scalable Graph Networks for Particle Simulations
- Disentangled Dynamic Graph Deep Generation
Math
- Supercards, Sunshines and Caterpillar Graphs
- An Extension of the Birkhoff-von Neumann Theorem to Non-Bipartite Graphs, with Vijay V. Vazirani
Conferences
- Natural Language Rationales with Full-Stack Visual Reasoning: From Pixels to Semantic Frames to Commonsense Graphs (EMNLP 2020)
- Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks (EMNLP)
- A Graph Representation of Semi-structured Data for Web Question Answering (COLING 2020)
- STP-UDGAT: Spatial-Temporal-Preference User Dimensional Graph Attention Network for Next POI Recommendation (CIKM 2020)
- Enhancing Extractive Text Summarization with Topic-Aware Graph Neural Networks (COLING 2020)
CIKM 2020 stats
Dates: Oct 19-23
Where: Online
Price: €70
Graph papers can be found at Paper Digest.
• 970/397 full/short submissions (vs 1030/470 in 2019)
• 193/103 accepted (vs 202/107 in 2019)
• 20% / 26% acceptance rate (vs 19/21% in 2019)
• ~97 total graph papers (20% of total)
Open Catalyst Project
Facebook and CMU have launched the Open Catalyst Project, which contains the largest dataset for quantum chemistry predictions. The goal is to predict atomic interactions faster than quantum mechanical simulations (DFT), which can be framed as a graph regression task.
Facebook
Facebook and Carnegie Mellon launch the Open Catalyst Project to find new ways to store renewable energy
Facebook AI and the Carnegie Mellon University (CMU) Department of Chemical Engineering are announcing the Open Catalyst Project, a collaboration intended to use…
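To make the "graph regression" framing concrete, here is a dependency-free toy sketch (my illustration, not the Open Catalyst baselines): atoms become nodes with scalar features, bonds become edges, one round of message passing updates node states, and sum-pooling reads out a single predicted energy. The weights `w_msg` and `w_out` are made-up placeholders that a real model would learn.

```python
import math

def gnn_energy(node_feats, edges, w_msg, w_out):
    """One message-passing round plus sum-pooling: graph in, scalar out."""
    n = len(node_feats)
    # Each node aggregates its neighbors' features over undirected edges.
    agg = [0.0] * n
    for u, v in edges:
        agg[u] += node_feats[v]
        agg[v] += node_feats[u]
    # Update node states with a tiny nonlinearity.
    hidden = [math.tanh(w_msg * (node_feats[i] + agg[i])) for i in range(n)]
    # Sum-pool node states into one graph-level prediction ("energy").
    return w_out * sum(hidden)
```

A trained model would stack many such rounds with learned weight matrices and use 3D atomic positions, but the input/output shape, a whole graph mapped to one number, is exactly what "graph regression" means here.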
What’s the Difference Between an Ontology and a Knowledge Graph?
An easy explanation of the distinction between KGs and ontologies. In short, ontology + data = knowledge graph.
Enterprise Knowledge
What's the Difference Between an Ontology and a Knowledge Graph? - Enterprise Knowledge
Ontologies are generalized semantic data models, while a knowledge graph is what we get when we leverage that model and apply it to instance data.
Graph Machine Learning research groups: Jiliang Tang
I do a series of posts on the groups in graph research; the previous post is here. The 17th is Jiliang Tang, coauthor of the book "Deep Learning on Graphs".
Jiliang Tang (~1974)
- Affiliation: Michigan State University
- Education: Ph.D. at Arizona State University in 2015 (advisor: Huan Liu)
- h-index 56
- Awards: best paper awards KDD, WSDM; Yahoo! awards; Distinguished Withrow Research Award; NSF Career Award
- Interests: graph neural networks, network analysis, anomaly detection on graphs
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Tina Eliassi-Rad
I do a series of posts on the groups in graph research, previous post is here. The 16th is Tina Eliassi-Rad, coauthor of Cora datasets that are still widely used in node classification benchmarks.…
Genesis Therapeutics: a startup working on GNN drug discovery
Launched in November 2019 out of Stanford's Pande Lab, Genesis Therapeutics is researching GNNs and graph generative models for drug discovery. They recently announced a partnership with Genentech, a large biotech company, to test their ML platform for pharma.
Fierce Biotech
Genentech taps Stanford University spinout for AI drug discovery partnership
Genentech has tapped an artificial intelligence startup spun out of Stanford University late last year to help it discover new drugs across multiple…
Fresh picks from ArXiv
Today at ArXiv: building graphs from pretrained language models, graph information bottleneck, and quantum entanglement ⚛️
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
- XLVIN: eXecuted Latent Value Iteration Nets (NeurIPS-DeepRL 2020), with Petar Veličković
- Learning to Execute Programs with Instruction Pointer Attention Graph Neural Networks (NeurIPS 2020)
- Graph Information Bottleneck (NeurIPS 2020), with Jure Leskovec
- Beta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs (NeurIPS 2020), with Jure Leskovec
- Graph Geometry Interaction Learning (NeurIPS 2020)
- Rethinking pooling in graph neural networks (NeurIPS 2020)
- Heterogeneous Hypergraph Embedding for Graph Classification (WSDM 2021)
- Contextual Heterogeneous Graph Network for Human-Object Interaction Detection (ECCV 2020)
Graphs
- A Differentiable Relaxation of Graph Segmentation and Alignment for AMR Parsing, with Ivan Titov
- Graph and graphon neural network stability, with Alejandro Ribeiro
- Language Models are Open Knowledge Graphs
- Can entanglement hide behind triangle-free graphs?
Survey
- Model Extraction Attacks on Graph Neural Networks: Taxonomy and Realization
arXiv.org
Rethinking pooling in graph neural networks
Graph pooling is a central component of a myriad of graph neural network (GNN) architectures. As an inheritance from traditional CNNs, most approaches formulate graph pooling as a cluster...
Python Bindings of JGraphT
There is a new Python binding of the popular Java library JGraphT, which is exciting news for those who want efficiency when working with graphs (in addition to the other recent news of an Nvidia GPU-accelerated package).
JGraphT is a Java library that contains very efficient and generic graph data structures along with a large collection of state-of-the-art algorithms. What's great is that the Python bindings offer an easy interface on all operating systems (including Windows) without requiring you to install a JVM. JGraphT is known for its efficiency, reliability, and large collection of graph algorithms, including PageRank, flows, cuts, vertex covers, colorings, isomorphism checking, and more.
Medium
Announcing the Python Bindings of JGraphT
The JGraphT is a stable and mature graph library targeting the JVM ecosystem. It contains very efficient and generic graph data-structures…
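To give a feel for what a call into such a library computes, here is a minimal pure-Python power-iteration PageRank. This is an illustrative sketch of the algorithm itself, not JGraphT's actual Python API:

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on an adjacency dict {node: [out-neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # Every node gets the teleportation mass up front.
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, nbrs in adj.items():
            if nbrs:
                # A node splits its damped rank evenly among out-neighbors.
                share = damping * rank[v] / len(nbrs)
                for u in nbrs:
                    new[u] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for u in nodes:
                    new[u] += damping * rank[v] / n
        rank = new
    return rank
```

A production library like JGraphT implements the same fixed-point computation with tuned data structures and convergence checks, which is exactly why bindings to it are attractive for large graphs.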