Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Some notes on graph visualization

Stephen Wolfram, creator of the Wolfram Language, recently made a post, Finally We May Have a Path to the Fundamental Theory of Physics… and It's Beautiful, where he discusses a possible origin of how the universe operates. The crux of his idea, as I understand it, is this: if you represent interactions between objects as a graph and specify rules for how existing interactions spawn new ones, you get beautiful graphs that resemble 3D shapes and that, in the limit, may represent our universe. You can then analyze properties of these graphs, such as diameter or curvature, to find equivalent notions in physics.

I won't speculate on whether the post is theoretically sound; let physicists debate that. In the end, any new theory should predict new facts, and for those we have to wait. But one thing is noticeable: the graphs Wolfram draws are beautiful. If you have ever tried to draw a big graph, you know how hard it is to make it not look like a mess, yet here you get pretty networks that genuinely resemble familiar 3D shapes. The Wolfram Language has many graph layouts, each producing a different image of the same graph; judging by the shapes in the post, he used the SpectralEmbedding or SpringElectricalEmbedding layout. Daniel Spielman, professor at Yale and two-time Gödel Prize winner, has a nice popsci video where he discusses how these drawings relate to spectral graph theory and what conditions the adjacency matrix must satisfy for a nice drawing. So maybe next time you will use one of these layouts to impress the reviewers of your paper.
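If you want to experiment with similar layouts outside the Wolfram Language, here is a minimal sketch in Python with networkx (my choice of library; the random geometric graph is just a stand-in for a graph with latent spatial structure):

```python
# Compare a spectral layout (positions from Laplacian eigenvectors) with a
# force-directed spring layout, roughly mirroring Wolfram's SpectralEmbedding
# and SpringElectricalEmbedding.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.random_geometric_graph(200, 0.15, seed=42)

fig, axes = plt.subplots(1, 2, figsize=(10, 5))
layouts = [("spectral", nx.spectral_layout(G)),
           ("spring", nx.spring_layout(G, seed=42))]
for ax, (name, pos) in zip(axes, layouts):
    nx.draw(G, pos=pos, ax=ax, node_size=10, width=0.3)
    ax.set_title(name)
plt.show()
```

The spectral layout places nodes using exactly the Laplacian eigenvectors Spielman discusses, so graphs with nice spectra get nice drawings almost for free.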
Discrete Differential Geometry Course

CS 15-458/858: Discrete Differential Geometry (Spring 2020) at Carnegie Mellon University. The lectures are available on YouTube, with discussions of the Laplace operator, smooth and discrete surfaces, curvature, and more.
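As a taste of the discrete side, here is a tiny numpy sketch (my own toy example, not course material) of the simplest discrete Laplace operator, the combinatorial graph Laplacian L = D - A:

```python
# Combinatorial graph Laplacian L = D - A for a triangle graph.
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # adjacency matrix
D = np.diag(A.sum(axis=1))              # degree matrix
L = D - A                               # graph Laplacian

print(np.linalg.eigvalsh(L))  # [0. 3. 3.]; the smallest eigenvalue is always 0
```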
April arXiv: how many graph papers?

From 18 March to 17 April there were 300 new and 108 updated graph papers in the arXiv CS section. This is around 50 papers fewer than in the previous period.
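If you want to reproduce such counts yourself, here is a rough sketch against the public arXiv API; the specific query (cs.LG papers with "graph" in the title) is my assumption, not the exact methodology behind the numbers above:

```python
# Count recent arXiv papers matching a query via the public arXiv API.
import urllib.request
import xml.etree.ElementTree as ET

url = ("http://export.arxiv.org/api/query?"
       "search_query=cat:cs.LG+AND+ti:graph"
       "&sortBy=submittedDate&sortOrder=descending&max_results=100")
with urllib.request.urlopen(url) as resp:
    feed = ET.parse(resp)

ns = {"atom": "http://www.w3.org/2005/Atom"}
entries = feed.getroot().findall("atom:entry", ns)
print(len(entries), "recent cs.LG papers with 'graph' in the title")
for e in entries[:5]:
    print("-", e.find("atom:title", ns).text.strip())
```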
Web Conference 2020

This week the fully-virtual Web Conference 2020 takes place. It will last 5 days, and you can still register (~200 USD).

There are tracks on social networks, semantics (KG), and user modeling, which often deal with graphs.
About every third paper is on graphs.

There will be 4 tutorials and 2 workshops on graphs (Monday-Tuesday), which I described in this post.
A forgotten story of Soviet AI

I found out about the Weisfeiler-Leman algorithm about 5 years ago, and some time later I realized that both authors were from the USSR. That was quite unexpected. I started looking up information about them and found a good biography of Boris Weisfeiler, written by his sister, but not much about Andrey Leman. For about a year I was searching for people who knew him, one by one; they are now quite senior and don't use fancy messengers. Finally, I gathered enough to write a post on his life, from his interest in math olympiads, to the development of the first AI chess player, to working in Silicon Valley.

His life is emblematic of the generation of mathematicians of his time: strong performance in math olympiads, competitive Moscow State University, work at the Institute of Theoretical and Experimental Physics, and then emigration to the West when the Iron Curtain fell. I like hearing these stories because they are reminiscent of the stories of my parents and their engineer friends. It's the voice of a time that is now inevitably gone. Similar to Babai's trip to the USSR, reading these stories uncovers the foundations of the graph theory, computer science, and artificial intelligence that we study today and lets us connect the dots between old and new approaches.
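For readers who haven't met the algorithm itself, here is a toy sketch of 1-dimensional Weisfeiler-Leman color refinement (my own minimal implementation), together with the classic pair of graphs it cannot tell apart:

```python
# 1-WL color refinement: repeatedly hash each node's color together with
# the multiset of its neighbors' colors; compare the final color histograms.
import networkx as nx

def wl_colors(G, rounds=3):
    colors = {v: 0 for v in G}  # start with a uniform coloring
    for _ in range(rounds):
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in G[v]))))
                  for v in G}
    return sorted(colors.values())

G1 = nx.cycle_graph(6)                                        # one 6-cycle
G2 = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))  # two triangles
print(wl_colors(G1) == wl_colors(G2))  # True: 1-WL cannot distinguish them
```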
Geometric and Relational Deep Learning

A workshop on GML will be streamed tomorrow on YouTube. It will start at 9:20 and continue until 17:00. The list of speakers is great:

Peter Battaglia, DeepMind
Natalia Neverova, Facebook AI Research (FAIR)
Stephan Günnemann, TU Munich
Yaron Lipman, Weizmann Institute of Science
Miltos Allamanis, Microsoft Research
Qi Liu, University of Oxford
Pim de Haan, University of Amsterdam & Qualcomm AI Research
Noemi Montobbio, Italian Institute of Technology
Geometric and Relational Deep Learning Pt. 2

Apparently this workshop will also have poster sessions, available only to registered participants. The list of papers can be found below (thanks to the people attending).

• A geometric deep learning model to filter out anatomically non plausible fibers from tractograms [video]
• Patient-Specific Pathological Gait Modelling with Conditional-NRI [video]
• GRASP: Graph Alignment through Spectral Signatures
• Isomorphism Leakage in Multi-Interaction Datasets [video]
• Integrating Spectral and Spatial Domain Graph Neural Networks
• Unshackling Bisimulation with Graph Neural Networks
• State2vec: Learning Off-Policy State Representations [video]
• Are Graph Convolutional Networks Fully Exploiting Graph Structure? [video]
• Principal Neighbourhood Aggregation Networks [video]
• Attentive Group Equivariant Convolutional Networks [video]
• SMP: An Equivariant Message Passing Scheme for Learning Graph Structural Information [video]
• Evaluation of Molecular Fingerprints for Similarity-based Virtual Screening generated through Graph Convolution Networks [video]
• Network alignment with GNN [video]
• Learning Generative Models across Incomparable Spaces
• Learning Set Operations for Deformable Shapes [video]
• Instant recovery of shape from spectrum via latent space connections [video]
• SIGN: Scalable Inception Graph Neural Networks [video]
• Universal Invariant and Equivariant Graph Neural Networks [video]
• Graph Convolutional Gaussian Processes for Link Prediction [video]
• Deep Graph Mapper: Seeing Graphs through the Neural Lens [video]
• Geoopt: Riemannian Optimization in PyTorch [video]
• HyperLearn: A Distributed Approach for Representation Learning in Datasets With Many Modalities
• Multi-relational Poincaré Graph Embeddings [video]
• On Understanding Knowledge Graph Representation [video]
• Learning Object-Object Relations in Video [video]
Graph Machine Learning research groups: Philip S. Yu

I'm doing a series of posts on groups in graph research. The fourth is Philip S. Yu. Fun fact: in 2019 he had 136 papers on Google Scholar (approximately one paper every 2.7 days).

Philip S. Yu (~1952)
- Affiliation: University of Illinois at Chicago;
- Education: Ph.D. at Stanford in 1978; MBA at NYU in 1982;
- h-index: 161;
- Awards: KDD, IEEE CS, and ICDM awards;
- Interests: data mining, anomaly detection on graphs, graph surveys, GNN.
ICLR 2020

The ICLR conference starts this Sunday with workshops, followed by 4 days of the main conference. If you registered, you can enter the ICLR portal, where the whole conference takes place (videos, chats, keynotes, etc.). They projected the papers with t-SNE so that you can quickly find relevant ones.
Finally, if you are interested in graph machine learning, revisit the post I did earlier this year on Top Trends of Graph Machine Learning in 2020, based on 150 graph papers submitted or accepted to ICLR.
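If you want to build a similar map yourself, here is a minimal sketch with scikit-learn; using TF-IDF on abstracts is my assumption, as I don't know which features the ICLR portal actually uses:

```python
# Project paper abstracts to 2D with TF-IDF features and t-SNE.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

abstracts = [
    "graph neural networks for molecules",
    "message passing on knowledge graphs",
    "transformers for language modeling",
    "reinforcement learning for robotics",
]  # in practice: one abstract per accepted paper

X = TfidfVectorizer().fit_transform(abstracts).toarray()
xy = TSNE(n_components=2, perplexity=2, random_state=0).fit_transform(X)

plt.scatter(xy[:, 0], xy[:, 1])
plt.title("Papers projected with t-SNE")
plt.show()
```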
ICLR 2020, Day 1

Here is a list of interesting papers from Day 1 (with links to the ICLR portal).

1. On Universal Equivariant Set Networks portal link
2. Hoppity: Learning Graph Transformations to Detect and Fix Bugs in Programs portal link
3. GraphSAINT: Graph Sampling Based Inductive Learning Method portal link
4. Measuring and Improving the Use of Graph Information in Graph Neural Networks portal link
5. Deep Double Descent: Where Bigger Models and More Data Hurt portal link

6. GraphAF: a Flow-based Autoregressive Model for Molecular Graph Generation portal link
7. Dynamically Pruned Message Passing Networks for Large-scale Knowledge Graph Reasoning portal link
8. You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings portal link
9. Deep Graph Matching Consensus portal link
10. PairNorm: Tackling Oversmoothing in GNNs portal link

11. Making Efficient Use of Demonstrations to Solve Hard Exploration Problems portal link
12. LambdaNet: Probabilistic Type Inference using Graph Neural Networks portal link
13. StructPool: Structured Graph Pooling via Conditional Random Fields portal link
14. Implementation Matters in Deep RL: A Case Study on PPO and TRPO portal link
ICLR 2020, Day 2

Day 1 was great: each paper had a prerecorded 5-minute video and 2 Zoom slots where you could ask the authors questions. Very convenient.

Here is a list of interesting papers of Day 2.

1. Abstract Diagrammatic Reasoning with Multiplex Graph Networks portal link
2. Probability Calibration for Knowledge Graph Embedding Models portal link
3. Learning to Guide Random Search portal link
4. Directional Message Passing for Molecular Graphs portal link
5. Locally Constant Networks portal link

6. Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data portal link
7. On the Weaknesses of Reinforcement Learning for Neural Machine Translation portal link
8. Scale-Equivariant Steerable Networks portal link
9. DropEdge: Towards Deep Graph Convolutional Networks on Node Classification portal link
10. Learning Heuristics for Quantified Boolean Formulas through Reinforcement Learning portal link

11. Memory-Based Graph Networks portal link
12. Are Transformers universal approximators of sequence-to-sequence functions? portal link
13. GLAD: Learning Sparse Graph Recovery portal link
14. Hyper-SAGNN: a self-attention based graph neural network for hypergraphs portal link

15. The Curious Case of Neural Text Degeneration portal link
16. Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering portal link
17. Global Relational Models of Source Code portal link
ICML 2020 Workshops

I don't know why it's so hard to find the workshops for ICML, especially given that submission deadlines are at the end of May, but here is the full list.

There are two graph workshops in particular: Graph Representation Learning and Beyond (GRL+) and Bridge Between Perception and Reasoning: Graph Neural Networks & Beyond. The first is more on graph representations, the latter more on reasoning with graph models, but they seem to overlap quite a lot.
ICLR 2020, Day 3

Day 3 has posters on the Reformer 🤖, GNN theory 📚, deep learning for mathematics ✍️, and much more. Check out these papers.

1. Reformer: The Efficient Transformer portal link
2. Graph Neural Networks Exponentially Lose Expressive Power for Node Classification portal link
3. Neural Execution of Graph Algorithms portal link
4. Mathematical Reasoning in Latent Space portal link
5. Deep Learning For Symbolic Mathematics portal link

6. Graph Convolutional Reinforcement Learning portal link
7. Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation portal link
8. Query2box: Reasoning over Knowledge Graphs in Vector Space Using Box Embeddings portal link
9. A Fair Comparison of Graph Neural Networks for Graph Classification portal link
10. Inductive representation learning on temporal graphs portal link
11. Inductive and Unsupervised Representation Learning on Graph Structured Objects portal link
List of open, simple, computational problems

There is a cool recent thread on MathOverflow about open problems in computer science that anyone can comprehend (thanks to Alex). This is an intriguing topic for me, as I think many math problems of the 20th century can be solved with smart computation in the 21st.

There are quite a few problems on graphs, such as finding Moore graphs or certain regular graphs. Besides this thread, there is an older, similar thread on MathOverflow, where a number of graph theory problems were also posed. Finally, the Open Problem Garden collects all sorts of graph theory conjectures that I believe could be much advanced by graph machine learning.
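As a taste, the Moore bound that these problems revolve around is easy to compute; a quick sketch, with the Petersen graph as an illustration:

```python
# A graph with maximum degree d and diameter k has at most
# 1 + d * sum((d-1)^i for i in range(k)) vertices (the Moore bound);
# graphs attaining it are Moore graphs. Whether a Moore graph with
# d=57, k=2 exists is still open.
import networkx as nx

def moore_bound(d, k):
    return 1 + d * sum((d - 1) ** i for i in range(k))

P = nx.petersen_graph()
print(moore_bound(3, 2))                    # 10
print(P.number_of_nodes(), nx.diameter(P))  # 10 2: Petersen attains the bound
```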
ICLR 2020, Day 4

The final day of ICLR 2020. I promise. You can unmute this channel now.

1. What graph neural networks cannot learn: depth vs width portal link
2. The Logical Expressiveness of Graph Neural Networks portal link
3. Reinforced Genetic Algorithm Learning for Optimizing Computation Graphs portal link
4. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations portal link
5. Contrastive Learning of Structured World Models portal link

6. GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding portal link
7. An Inductive Bias for Distances: Neural Nets that Respect the Triangle Inequality portal link
8. Learning deep graph matching with channel-independent embedding and Hungarian attention portal link
9. On the Equivalence between Positional Node Embeddings and Structural Graph Representations portal link