Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
ICLR 2020, Day 1

Here is a list of interesting papers from Day 1 (with their links to the ICLR portal).

1. On Universal Equivariant Set Networks portal link
2. Hoppity: Learning Graph Transformations to Detect and Fix Bugs in Programs portal link
3. GraphSAINT: Graph Sampling Based Inductive Learning Method portal link
4. Measuring and Improving the Use of Graph Information in Graph Neural Networks portal link
5. Deep Double Descent: Where Bigger Models and More Data Hurt portal link

6. GraphAF: a Flow-based Autoregressive Model for Molecular Graph Generation portal link
7. Dynamically Pruned Message Passing Networks for Large-scale Knowledge Graph Reasoning portal link
8. You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings portal link
9. Deep Graph Matching Consensus portal link
10. PairNorm: Tackling Oversmoothing in GNNs portal link

11. Making Efficient Use of Demonstrations to Solve Hard Exploration Problems portal link
12. LambdaNet: Probabilistic Type Inference using Graph Neural Networks portal link
13. StructPool: Structured Graph Pooling via Conditional Random Fields portal link
14. Implementation Matters in Deep RL: A Case Study on PPO and TRPO portal link
ICLR 2020, Day 2

Day 1 was great: each paper has a prerecorded 5-minute video and two slots during which you can ask questions over Zoom. Very convenient.

Here is a list of interesting papers from Day 2.

1. Abstract Diagrammatic Reasoning with Multiplex Graph Networks portal link
2. Probability Calibration for Knowledge Graph Embedding Models portal link
3. Learning to Guide Random Search portal link
4. Directional Message Passing for Molecular Graphs portal link
5. Locally Constant Networks portal link

6. Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data portal link
7. On the Weaknesses of Reinforcement Learning for Neural Machine Translation portal link
8. Scale-Equivariant Steerable Networks portal link
9. DropEdge: Towards Deep Graph Convolutional Networks on Node Classification portal link
10. Learning Heuristics for Quantified Boolean Formulas through Reinforcement Learning portal link

11. Memory-Based Graph Networks portal link
12. Are Transformers universal approximators of sequence-to-sequence functions? portal link
13. GLAD: Learning Sparse Graph Recovery portal link
14. Hyper-SAGNN: a self-attention based graph neural network for hypergraphs portal link
15. The Curious Case of Neural Text Degeneration portal link

16. Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering portal link
17. Global Relational Models of Source Code portal link
ICML 2020 Workshops

I don't know why it is so hard to find the workshops for ICML, especially since the submission deadlines are at the end of May, but here is a full list.

There are two graph workshops in particular:
Graph Representation Learning and Beyond (GRL+)
and Bridge Between Perception and Reasoning: Graph Neural Networks & Beyond. The first is more about graph representations, while the latter is more about reasoning with graph models, but they seem to overlap quite a lot.
ICLR 2020, Day 3

Day 3 has posters for the Reformer 🤖, GNN theory 📚, deep learning for mathematics ✍️, and much more. Check out these papers.

1. Reformer: The Efficient Transformer portal link
2. Graph Neural Networks Exponentially Lose Expressive Power for Node Classification portal link
3. Neural Execution of Graph Algorithms portal link
4. Mathematical Reasoning in Latent Space portal link
5. Deep Learning For Symbolic Mathematics portal link

6. Graph Convolutional Reinforcement Learning portal link
7. Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation portal link
8. Query2box: Reasoning over Knowledge Graphs in Vector Space Using Box Embeddings portal link
9. A Fair Comparison of Graph Neural Networks for Graph Classification portal link
10. Inductive representation learning on temporal graphs portal link
11. Inductive and Unsupervised Representation Learning on Graph Structured Objects portal link
List of open, simple, computational problems

There is a cool recent thread on MathOverflow about open problems in computer science that anyone can comprehend (thanks to Alex). This is an intriguing topic for me, as I think that many math problems of the 20th century can be solved with smart computation in the 21st century.

There are quite a few problems on graphs, such as finding Moore graphs or regular graphs. Besides this thread, there is an older, similar thread on MathOverflow where a number of graph theory problems were also posed. Finally, the Open Problem Garden collects all sorts of graph theory conjectures that I believe could be advanced considerably by graph machine learning.
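
As a toy illustration of how approachable such checks are computationally, here is a minimal sketch (my own illustration, assuming networkx is installed; not code from any of the threads above) that tests whether a regular graph of diameter 2 attains the Moore bound n = k^2 + 1:

```python
# Minimal sketch (assumes networkx): check whether a k-regular graph of
# diameter 2 attains the Moore bound n = k^2 + 1, i.e. is a Moore graph.
import networkx as nx

def is_moore_graph_diameter_2(G):
    degrees = {d for _, d in G.degree()}
    if len(degrees) != 1:          # must be regular
        return False
    k = degrees.pop()
    if nx.diameter(G) != 2:        # the bound below is for diameter 2
        return False
    return G.number_of_nodes() == k ** 2 + 1

# The Petersen graph is the Moore graph with k = 3: 10 = 3^2 + 1 nodes.
print(is_moore_graph_diameter_2(nx.petersen_graph()))  # True
# A 4-cycle is 2-regular with diameter 2 but has 4 < 2^2 + 1 nodes.
print(is_moore_graph_diameter_2(nx.cycle_graph(4)))    # False
```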
ICLR 2020, Day 4

The final day of ICLR 2020. I promise. You can unmute this channel now.

1. What graph neural networks cannot learn: depth vs width portal link
2. The Logical Expressiveness of Graph Neural Networks portal link
3. Reinforced Genetic Algorithm Learning for Optimizing Computation Graphs portal link
4. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations portal link
5. Contrastive Learning of Structured World Models portal link

6. GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding portal link
7. An Inductive Bias for Distances: Neural Nets that Respect the Triangle Inequality portal link
8. Learning deep graph matching with channel-independent embedding and Hungarian attention portal link
9. On the Equivalence between Positional Node Embeddings and Structural Graph Representations portal link
Thoughts from the first virtual conference

I had a nice experience at the virtual ICLR 2020. Most of the poster sessions were empty, which allowed me to bother the authors with questions. Each paper had two slots during the day, so I could definitely attend it. The chat made it easy to find other attendees, something I have had difficulty with at physical conferences. Overall, in terms of the insights I gained, it was much more valuable than an in-person conference. But I didn't present, and I can understand that other people didn't get what they wanted.

By the way, the organizers promised to make the portal available to everyone soon.

Now, here are some insights that I gained from the papers.

1) The theoretical explanation of GNNs is a hot topic. We now know some problems that can be approximated with GNNs, functions that GNNs can compute, and limitations of GNNs (a minimal 1-WL sketch follows these points). [paper 1, paper 2, paper 3, paper 4]

2) One emerging topic is teaching GNNs to learn algorithms, instead of doing a classification task. Here be dragons. [paper 1, paper 2]

3) GNNs are used to represent programs and equations, so potentially you can prove theorems with them. [paper 1, paper 2, paper 3, paper 4, paper 5]
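
On point 1, much of this theory is phrased in terms of the 1-Weisfeiler-Leman (1-WL) colour refinement test, which upper-bounds what message-passing GNNs can distinguish. Below is a minimal plain-Python sketch of 1-WL (my own illustration, not code from any of the papers) showing that a 6-cycle and two disjoint triangles receive identical colours and are therefore indistinguishable to standard message-passing GNNs:

```python
# Minimal sketch of 1-WL colour refinement, the test that upper-bounds the
# expressive power of message-passing GNNs.
def wl_colours(adj, rounds=3):
    """adj: dict node -> list of neighbours. Returns the multiset of final colours."""
    colours = {v: 0 for v in adj}                      # uniform initial colours
    for _ in range(rounds):
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # relabel signatures with small integers to get the new colours
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colours = {v: palette[signatures[v]] for v in adj}
    return sorted(colours.values())

# 6-cycle vs. two disjoint triangles: different graphs, same 1-WL colours,
# hence indistinguishable by standard message-passing GNNs.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_colours(cycle6) == wl_colours(triangles))     # True
```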
Videos from Geometric and Relational Deep Learning Workshop

Videos from the workshop are now available. Two of my favorites are:

* Peter Battaglia: Learning Physics with Graph Neural Networks [video]

* Yaron Lipman: Deep Learning of Irregular and Geometric Data [video]
ICLR 2020 Recordings

All recordings for papers and workshops are now available to everyone!
KDD 2020: Workshop on Deep Learning on Graphs

If you missed the ICML deadlines, there is another good workshop on GML at KDD.
Deadline: 15 June
5 pages, double-blind
Graph Representation Learning for Algorithmic Reasoning

Another idea that comes up more and more frequently in recent graph papers is to learn a particular graph algorithm, such as Bellman-Ford or breadth-first search, instead of doing node classification or link prediction. Here is a video from WebConf'20 by Petar Veličković (DeepMind) motivating this approach.
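
For intuition, the supervision in this line of work is typically the sequence of the algorithm's intermediate states rather than only its final output. Here is a minimal sketch (my own toy illustration, plain Python, unweighted graphs as adjacency lists) of how per-step BFS reachability targets could be generated for a GNN to imitate:

```python
# Minimal sketch: generate the per-step reachability states of BFS on an
# unweighted graph. In algorithmic-reasoning work such step-by-step traces
# (not only the final output) serve as supervision for a GNN.
def bfs_states(adj, source):
    """adj: dict node -> list of neighbours. Returns a list of 0/1 vectors,
    one per BFS step, marking which nodes have been reached so far."""
    nodes = sorted(adj)
    reached = {source}
    states = [[int(v in reached) for v in nodes]]
    frontier = {source}
    while frontier:
        frontier = {u for v in frontier for u in adj[v]} - reached
        if not frontier:
            break
        reached |= frontier
        states.append([int(v in reached) for v in nodes])
    return states

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a path 0-1-2-3
for step, state in enumerate(bfs_states(adj, source=0)):
    print(step, state)
# 0 [1, 0, 0, 0]
# 1 [1, 1, 0, 0]
# 2 [1, 1, 1, 0]
# 3 [1, 1, 1, 1]
```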
Graph Machine Learning research groups: Michael Bronstein

I am doing a series of posts on research groups in graph ML. The fifth is about Michael Bronstein. He founded Fabula AI, a company that detects fake news in social networks and was acquired by Twitter. He was also a committee member at my PhD defense 🙂

Michael Bronstein (1980)
- Affiliation: Imperial College London; Twitter
- Education: Ph.D. at the Israel Institute of Technology in 2007 (supervised by Ron Kimmel);
- h-index: 61;
- Awards: IEEE and IAPR Fellow, Dalle Molle prize, Royal Society Wolfson Research Merit Award;
- Interests: computer graphics, geometric deep learning, graph neural networks.
Max Welling Talk GNN

I recently thought about what other types of GNNs exist beyond message passing. I think one of them could be equivariant networks, i.e. neural networks with permutation-equivariance properties, but I think there are other powerful graph models yet to be discovered.
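
To make the permutation-equivariance property concrete, here is a minimal numerical sketch (NumPy only, using a generic GCN-style layer f(A, X) = A X W rather than any particular architecture from the talk) checking that permuting the nodes permutes the output in the same way, i.e. f(P A P^T, P X) = P f(A, X):

```python
# Minimal numerical sketch (NumPy only): a GCN-style layer f(A, X) = A @ X @ W
# is permutation-equivariant, i.e. f(P A P^T, P X) = P f(A, X).
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 5, 3, 4
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1); A = A + A.T                 # random undirected adjacency matrix
X = rng.normal(size=(n, d_in))                 # node features
W = rng.normal(size=(d_in, d_out))             # shared weights

def layer(A, X):
    return A @ X @ W

P = np.eye(n)[rng.permutation(n)]              # random permutation matrix
lhs = layer(P @ A @ P.T, P @ X)                # permute the graph, then apply the layer
rhs = P @ layer(A, X)                          # apply the layer, then permute the output
print(np.allclose(lhs, rhs))                   # True
```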

In this video, Max Welling discusses his recent work on equivariant NNs for meshes and factor GNNs.
Introduction to Deep Learning (I2DL)

There is a course on deep learning by the Technical University of Munich. Recordings, slides, and exercises are available online.
Secrets of the Surface: The Mathematical Vision of Maryam Mirzakhani

There is a documentary that you can watch on the life of Maryam Mirzakhani. In 2014, she was awarded the Fields Medal for her work on "the dynamics and geometry of Riemann surfaces and their moduli spaces." You can read about her in this article. For the film, you can register here and they will send a Vimeo link, which will be available until 19 May.
AI and Theorem Proving

One of the topics that caught my attention was using AI to automate theorem proving. Apparently, there is already a conference on this. At ICLR, there was a paper on using graph networks for theorem proving.

I think that besides this conference, which mainly explores how to model mathematical logic using embeddings, another approach to theorem proving is smart pruning of combinatorial spaces (e.g., you have a large space of graphs from which you need to pick particular examples).
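
As a toy version of that second flavour, here is a minimal sketch (plain Python brute force, purely illustrative) that enumerates all labelled graphs on 5 nodes and prunes the space down to the triangle-free 2-regular ones; the hope is that learned models could guide this kind of pruning far more cleverly on spaces where brute force is hopeless:

```python
# Toy sketch of pruning a combinatorial space of graphs: brute-force enumerate
# all labelled graphs on 5 nodes and keep the triangle-free 2-regular ones
# (the labelled 5-cycles).
from itertools import combinations

nodes = range(5)
all_edges = list(combinations(nodes, 2))            # 10 possible edges

def degree(v, edges):
    return sum(v in e for e in edges)

def has_triangle(edges):
    es = set(edges)
    return any({(a, b), (a, c), (b, c)} <= es for a, b, c in combinations(nodes, 3))

kept = []
for mask in range(2 ** len(all_edges)):             # all 1024 edge subsets
    edges = [e for i, e in enumerate(all_edges) if mask >> i & 1]
    if all(degree(v, edges) == 2 for v in nodes) and not has_triangle(edges):
        kept.append(edges)

print(len(kept))   # 12 labelled 5-cycles survive the pruning
```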