Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Knowledge Graphs @ ICLR 2021

The one and only Michael Galkin does it again with a superb digest of knowledge graph research at ICLR 2021. Topics include reasoning, temporal logics, and complex question answering in KGs: a lot of novel ideas and less SOTA-chasing work!
Fresh picks from ArXiv
This week on ArXiv: optimization properties of GNNs, a review of sampling-based approaches, and time zigzags for Ethereum price prediction 💰

If I forgot to mention your paper, please shoot me a message and I will update the post.

Conferences
* Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders ACL 2021
* Neural Graph Matching based Collaborative Filtering SIGIR 2021
* Z-GCNETs: Time Zigzags at Graph Convolutional Networks for Time Series Forecasting ICML 2021
* Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth ICML 2021

Efficiency
* Scalable Graph Neural Network Training: The Case for Sampling
* VersaGNN: a Versatile accelerator for Graph neural networks
New Proof Reveals That Graphs With No Pentagons Are Fundamentally Different

A new article at Quanta about the Erdős–Hajnal conjecture, which states that any graph that forbids some fixed graph as an induced subgraph must contain a large clique or a large independent set. The article discusses a recent paper that confirms the conjecture for a special case that was deemed the hardest. Now there is hope that the conjecture is true in general.
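For reference, here is the standard formal statement of the conjecture (a textbook formulation, not quoted from the article):

```latex
% Erdős–Hajnal conjecture: for every graph H there is a constant
% \delta(H) > 0 such that every n-vertex graph G with no induced
% copy of H contains a clique or independent set of polynomial size.
\forall H \;\exists\, \delta(H) > 0:\quad
G \text{ is } H\text{-free} \;\Longrightarrow\;
\max\{\omega(G),\, \alpha(G)\} \;\ge\; n^{\delta(H)}, \qquad n = |V(G)|.
```

For comparison, Ramsey theory only guarantees a clique or independent set of size roughly log n in an arbitrary graph, which is what makes the polynomial bound so striking.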
Constructions in combinatorics via neural networks

I have been fascinated by the potential of using machine learning for combinatorial problems and have written multiple posts (here and here) and a survey about this. So it was exciting to see a work that applies an RL framework to disprove several combinatorial conjectures.

The algorithm is very simple: generate many graphs with an MLP, select the top-X of them, and update the MLP with a cross-entropy loss on those elite samples (see the sketch below). It does not use recent advances in RL, nor GML ideas such as invariance of the input, so there is room for improvement. It also generates graphs of a pre-determined size, so if a counterexample has large order, it would be hard to know that in advance. Still, it would be very interesting to apply this framework to more complicated conjectures such as the reconstruction conjecture.
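To make the loop concrete, here is a minimal sketch of that cross-entropy method, assuming a fixed number of vertices and a placeholder `score` function; in practice the reward would encode the conjectured quantity you are trying to push past its bound (N, BATCH, ELITE_FRAC are illustrative choices, not values from the paper):

```python
# Cross-entropy method: sample graphs with an MLP, keep the top scorers,
# and fit the MLP to them with a cross-entropy loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

N = 19                       # number of vertices, fixed in advance
E = N * (N - 1) // 2         # number of potential edges
BATCH, ELITE_FRAC = 256, 0.1

# The MLP sees the edge string built so far (zero-padded) plus a one-hot
# marker of the current position, and outputs P(edge i is present).
net = nn.Sequential(nn.Linear(2 * E, 128), nn.ReLU(),
                    nn.Linear(128, 1), nn.Sigmoid())
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def policy_input(edges, i):
    x = torch.zeros(edges.size(0), 2 * E)
    x[:, :i] = edges[:, :i]          # edge bits decided so far
    x[:, E + i] = 1.0                # position marker
    return x

def score(edges):
    return edges.sum(dim=1)          # placeholder reward: edge count

for step in range(1000):
    # 1. Sample a batch of graphs, one edge decision at a time.
    edges = torch.zeros(BATCH, E)
    with torch.no_grad():
        for i in range(E):
            p = net(policy_input(edges, i)).squeeze(1)
            edges[:, i] = torch.bernoulli(p)
    # 2. Keep the top-X ("elite") graphs by reward.
    k = int(BATCH * ELITE_FRAC)
    elite = edges[score(edges).topk(k).indices]
    # 3. Cross-entropy update: make the policy imitate the elites.
    loss = 0.0
    for i in range(E):
        p = net(policy_input(elite, i)).squeeze(1)
        loss = loss + F.binary_cross_entropy(p, elite[:, i])
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same loop transfers to any conjecture for which you can compute a numeric score from an adjacency string.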
On Explainability of Graph Neural Networks via Subgraph Explorations

This is a guest post by Shuiwang Ji about their recent work, accepted to ICML 2021.

Title: "On Explainability of Graph Neural Networks via Subgraph Explorations"

TL;DR:
- We propose a novel method, known as SubgraphX, to explain GNNs by exploring and identifying important subgraphs.
- We propose to incorporate Monte Carlo tree search to explore subgraphs, together with efficient approximation schemes to measure subgraph importance via Shapley values.
- Our proposed method consistently and significantly outperforms state-of-the-art techniques.
Code is now available as part of our DIG library.

We study the explainability of Graph Neural Networks and propose a novel method (SubgraphX) to provide subgraph-level explanations. While existing methods mainly focus on explaining GNNs with graph nodes or edges, we argue that subgraphs are more intuitive and human-intelligible.

In our SubgraphX, we propose to explore different subgraphs with Monte Carlo tree search. For each subgraph, we measure its importance using Shapley values, which can capture the interactions among different graph structures. We further improve efficiency with our proposed approximation schemes for computing Shapley values on graph data. Both quantitative and qualitative studies show that our method obtains higher-quality and more human-intelligible explanations while keeping the time complexity acceptable.
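To give a flavor of the Shapley step, here is a generic Monte Carlo estimator of a subgraph's Shapley value. This is not the authors' implementation (that is in the DIG library); the `model(x, edge_index)` signature returning graph-level logits, the zero-feature masking, and the choice of a local neighborhood as the player set are assumptions of the sketch:

```python
# Monte Carlo Shapley estimate for a candidate subgraph: sample random
# coalitions of context nodes and average the marginal gain in the
# model's predicted class probability when the subgraph is added.
import random
import torch

def subgraph_shapley(model, x, edge_index, subgraph_nodes,
                     context_nodes, target_class, num_samples=100):
    def coalition_pred(keep):
        mask = torch.zeros(x.size(0), 1)
        if keep:
            mask[list(keep)] = 1.0       # excluded nodes get zero features
        with torch.no_grad():
            logits = model(x * mask, edge_index)   # assumed: [1, C] output
        return logits.softmax(-1)[0, target_class].item()

    others = [v for v in context_nodes if v not in set(subgraph_nodes)]
    total = 0.0
    for _ in range(num_samples):
        # A uniformly random permutation position yields an unbiased
        # Shapley sample: the coalition is everything "before" the subgraph.
        random.shuffle(others)
        coalition = set(others[:random.randrange(len(others) + 1)])
        total += (coalition_pred(coalition | set(subgraph_nodes))
                  - coalition_pred(coalition))
    return total / num_samples
```

In the paper, Monte Carlo tree search proposes which subgraphs to score, and restricting the players to a local neighborhood is what keeps the number of model calls manageable.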

Our method represents the first attempt to explain GNNs by explicitly studying subgraphs. We hope that this work can provide a new direction for the community to investigate the explainability of GNNs in the future.
GraphDF: A Discrete Flow Model for Molecular Graph Generation

This is a guest post by Shuiwang Ji about their recent work, accepted to ICML 2021.

Title: โ€œGraphDF: A Discrete Flow Model for Molecular Graph Generationโ€

TL;DR:
- We propose GraphDF, a novel discrete latent variable model for molecular graph generation.
- We propose to use an invertible modulo shift transform to sequentially generate graph nodes and edges from discrete latent variables.
- Our proposed method outperforms prior methods on random generation, property optimization, and constrained optimization tasks.
Code is now available as part of our DIG library.

We study the molecular generation problem and propose a novel method (GraphDF) achieving new state-of-the-art performance. While prior methods use continuous latent variables, we argue that discrete latent variables are more suitable to model the categorical distribution of graph nodes and edges.

In our GraphDF, the molecular graph is generated by sequentially applying a modulo shift transform that converts a sampled discrete latent variable into the categorical value of the next node or edge type. Using discrete latent variables eliminates the negative effect of dequantization and models the underlying distribution of graph structures more accurately. The modulo shift transform captures conditional information from the previously generated sub-graph via graph convolutional networks to ensure order invariance. Comprehensive studies show that our method outperforms prior methods on random generation, property optimization, and constrained optimization tasks.
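The core invertible step is easy to illustrate. Below is a minimal sketch of the modulo shift transform with the shift passed as a plain argument; in GraphDF the shift is produced by graph convolutions over the partially generated graph:

```python
# Minimal sketch of the invertible modulo-shift step of a discrete flow:
# x = (z + mu) mod K maps a discrete latent z to a categorical symbol x
# and is exactly invertible for any shift mu, so no dequantization is
# needed. K, the tensor shapes, and the random mu are illustrative only.
import torch

def mod_shift_forward(z, mu, K):
    """Latent z in {0..K-1} -> observed symbol x in {0..K-1}."""
    return (z + mu) % K

def mod_shift_inverse(x, mu, K):
    """Exact inverse: recover the latent from the symbol."""
    return (x - mu) % K

K = 4                                  # e.g. number of node/edge types
z = torch.randint(0, K, (8,))          # sampled discrete latents
mu = torch.randint(0, K, (8,))         # shifts (a GNN output in the paper)
x = mod_shift_forward(z, mu, K)
assert torch.equal(mod_shift_inverse(x, mu, K), z)  # bijection holds
```

Because the map is a bijection on {0, ..., K-1} for any shift, likelihoods of discrete graphs can be computed exactly, with no dequantization noise.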

Our method is the first work to model the density of complicated molecular graph data with discrete latent variables. We hope that it can provide new insight for the community to explore more powerful graph generation models in the future.
Rethinking Graph Neural Architecture Search from Message-passing

With the abundance of GNN architectures, it's natural to ask how to select the right architecture for your task. A recent CVPR 2021 work proposes a generic architecture that encompasses many existing GNNs and is optimized via gradient descent. After optimization, the resulting GNN may end up with a different architecture at each layer.
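Roughly, such a search can be set up DARTS-style: each layer holds several candidate message-passing operators and learnable mixing weights, and both are trained by gradient descent. A generic sketch under these assumptions (this is not the paper's exact search space):

```python
# Differentiable search over message-passing operators: each layer mixes
# candidate aggregations with softmax(alpha) weights; alpha is learned
# jointly with the model weights by gradient descent.
import torch
import torch.nn as nn

class SearchableGNNLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.alpha = nn.Parameter(torch.zeros(3))  # one weight per operator

    def forward(self, x, adj):
        # x: [N, dim] node features; adj: [N, N] dense adjacency matrix
        h = self.lin(x)
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        neigh = adj.unsqueeze(-1) * h.unsqueeze(0)  # [N, N, dim] messages
        candidates = [
            h,                          # skip: no aggregation
            adj @ h / deg,              # mean over neighbors
            neigh.max(dim=1).values,    # max over neighbors (0 if none)
        ]
        w = torch.softmax(self.alpha, dim=0)
        return torch.relu(sum(wi * c for wi, c in zip(w, candidates)))
```

After training, each layer keeps only its highest-weighted operator, which is how the final network can end up with a different architecture in every layer.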
Graph Machine Learning research groups: Yizhou Sun

I do a series of posts on research groups in graph ML; the previous post is here. The 28th is Yizhou Sun, a professor at UCLA, who co-authored a book on heterogeneous information networks.

Yizhou Sun (~1982)
- Affiliation: UCLA
- Education: Ph.D. at UIUC in 2012 (advisor: Jiawei Han)
- h-index 48
- Interests: heterogeneous information networks, self-supervised learning, community detection
- Awards: best research papers at KDD, ASONAM
Mathematicians Answer Old Question About Odd Graphs

A new post at Quanta about the work that settles a question from the 1960s: how big a subgraph, with every vertex having odd degree within it, is every graph guaranteed to contain?
NAACL-2021 Papers

A list of papers accepted to the NLP conference NAACL 2021 is available at the digest console. There are ~40 graph papers out of 476 papers.
GNN User Group: meeting 5

The fifth meeting of the GNN user group will include talks from:

* 4:00 - 4:25 (PST): Graphite: GRAPH-Induced feaTure Extraction for Point Cloud Registration (Mahdi Saleh, TUM).
* 4:25 - 4:50 (PST): Optimizing Graph Transformer Networks with Graph-based Techniques (Loc Hoang, University of Texas at Austin)
* 4:50 - 5:15 (PST): Encoding the Core Business Entities Using Meituan Brain (Mengdi Zhang, Meituan)
* 5:15 - 5:30 (PST): Open Discussion and Networking

Please join us today, 27 May! The Zoom link is in the description.
Reinforcement learning for combinatorial optimization: A survey

Our work surveying recent RL methods for solving combinatorial optimization problems has been accepted to the Computers & Operations Research journal.

This is a very active field right now, and it shows a lot of promise. Traditionally, NP-hard problems such as the Traveling Salesman Problem were solved by algorithms designed specifically for each problem. With RL, it's possible to extend the toolbox by learning a function from available data. I really hope that 10 years from now, using ML approaches for combinatorial problems will be commonplace.
Graph papers at ICML 2021

ICML 2021 papers are announced; here is some analysis of them.

There are about 58 graph papers (if I didn't mention your paper, let me know, I'll fix it).

The top authors are displayed.
Almost Free Inductive Embeddings Out-Perform Trained Graph Neural Networks in Graph Classification in a Range of Benchmarks

A nice blog post by Vadym Safronov (also available in Russian here) which shows that you can use an untrained GCN to match or exceed the performance of an end-to-end trained GCN on graph classification benchmarks.
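The recipe is easy to try: freeze a randomly initialized GCN, pool its node embeddings into one vector per graph, and fit a linear classifier on top. A minimal sketch assuming dense adjacency matrices and 16-dimensional node features (the post's exact setup may differ):

```python
# Untrained (frozen, randomly initialized) two-layer GCN as a graph
# feature extractor, with a logistic-regression head on top.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
W1 = rng.normal(size=(16, 64))       # frozen random weights, never trained
W2 = rng.normal(size=(64, 64))

def embed(adj, feats):
    """Two GCN layers with random weights + mean pooling."""
    a_hat = adj + np.eye(len(adj))                # add self-loops
    d = a_hat.sum(axis=1)
    a_norm = a_hat / np.sqrt(np.outer(d, d))      # symmetric normalization
    h = np.maximum(a_norm @ feats @ W1, 0.0)      # layer 1 + ReLU
    h = np.maximum(a_norm @ h @ W2, 0.0)          # layer 2 + ReLU
    return h.mean(axis=0)                         # mean readout

# graphs: list of (adjacency [n, n], features [n, 16]) pairs; y: labels
# X = np.stack([embed(a, f) for a, f in graphs])
# clf = LogisticRegression(max_iter=1000).fit(X, y)
```

Only the linear head is trained, so most of the compute of end-to-end training disappears.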
Graph Machine Learning research groups: Gal Chechik

I do a series of posts on research groups in graph ML; the previous post is here. The 29th is Gal Chechik, a professor at the Gonda Brain Research Institute and a director of AI at NVIDIA in Israel.

Gal Chechik (~1976)
- Affiliation: Bar Ilan University, Israel; NVIDIA
- Education: Ph.D. at Hebrew University, Israel in 2004 (advisors: Naftali Tishby and Israel Nelken)
- h-index 37
- Interests: biological systems, theory of GNNs, equivariant functions.
- Awards: best papers at ICML, ISMB; Fulbright fellowship, Alon fellowship