Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
CS 520 Knowledge Graphs Video Materials

It seems that they will make the lectures available on YouTube 👍
Fresh picks from ArXiv
This week is full of CVPR papers with graphs, surveys on recommendation, and a historical note on the discovery of a TSP algorithm in the USSR 👨‍🚀

CVPR
HOPE-Net: A Graph-based Model for Hand-Object Pose Estimation
Graph Structured Network for Image-Text Matching
Spatio-Temporal Graph for Video Captioning with Knowledge Distillation
Multi-Modal Graph Neural Network for Joint Reasoning on Vision and Scene Text
Disentangling and Unifying Graph Convolutions for Skeleton-Based Action Recognition
Deep Homography Estimation for Dynamic Scenes
Iterative Context-Aware Graph Inference for Visual Dialog
Geometrically Principled Connections in Graph Neural Networks with Michael Bronstein
LiDAR-based Online 3D Video Object Detection with Graph-based Message Passing and Spatiotemporal Transformer Attention

WebConf
Graph Enhanced Representation Learning for News Recommendation

Survey
Deep Learning on Knowledge Graph for Recommender System: A Survey
Heterogeneous Network Representation Learning: Survey, Benchmark, Evaluation, and Beyond — essentially a survey on KG
A historical note on the 3/2-approximation algorithm for the metric traveling salesman problem — shows that the Christofides algorithm for TSP was discovered in the USSR around the same time
A Survey on Conversational Recommender Systems

Graph Theory
Longest paths in random hypergraphs
A Fast Algorithm for the Product Structure of Planar Graphs
Counterexamples to "A Conjecture on Induced Subgraphs of Cayley Graphs"
RL for TSP

There is already a rich literature on applying RL methods to the Traveling Salesman Problem (TSP). In one line of research the solution is built incrementally, one node at a time; this is the approach of, for example, the popular paper Attention, Learn to Solve Routing Problems!. Recently another approach appeared: start with some solution to TSP and improve it by swapping nodes. For example, Learning 2-opt Heuristics for the Traveling Salesman Problem via Deep Reinforcement Learning uses a policy gradient method and shows that it significantly outperforms incremental approaches.
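For intuition, here is a minimal sketch of the classical 2-opt move that such methods learn to guide: reverse a segment of the tour whenever the reversal shortens it. This is plain local search on 2D points, not the paper's learned policy, and all names are mine.

```python
import math

def tour_length(tour, pts):
    # Total length of a closed tour over a list of (x, y) points.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    # Greedy 2-opt: keep reversing segments while doing so shortens the tour.
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 2, len(tour) + 1):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, pts) < tour_length(tour, pts):
                    tour, improved = candidate, True
    return tour
```

The RL formulation essentially replaces the exhaustive scan over (i, j) pairs with a learned policy that proposes promising reversals.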
Graph Representation Learning Workshop ICML 20

One of the places that gathers many researchers in GML is the workshop on Graph Representation Learning. This year it will be held at ICML 2020. You can submit short papers (4 pages), including work that you can later resubmit to a full conference elsewhere. This year there will also be a separate track on using graphs to tackle COVID-19. Deadline: 29th May.
Some notes on the reconstruction conjecture

Studying the reconstruction conjecture is going down the rabbit hole 🐇 It seems so simple, but after a few steps its complexity unfolds, and you find yourself reading yet another paper that provides some extra insight which only steers you away from the target.

Formulation. You have a graph. You delete one vertex from it and put the obtained subgraph into a list, called the deck. Then you return to the original graph, delete another vertex, and again add the obtained subgraph to the deck. And so on, for all n vertices. The main question is whether there exist two non-isomorphic graphs that have the same deck.
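A minimal sketch of this construction, assuming networkx (the function name is mine):

```python
import networkx as nx

def deck(g: nx.Graph) -> list[nx.Graph]:
    # The deck: one card per vertex, each a one-vertex-deleted subgraph.
    cards = []
    for v in list(g.nodes):
        card = g.copy()
        card.remove_node(v)
        cards.append(card)
    return cards

# A graph on n vertices yields a deck of n cards.
print(len(deck(nx.petersen_graph())))  # 10
```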

Difficulty. I already wrote about this conjecture, showing that if you delete more than 1 vertex then there are indeed two different graphs with the same decks. It is the fact that you delete only one vertex that makes the problem difficult. Now think: you have n subgraphs, each on (n-1) of the original vertices, so the input seems to carry more than enough information, O(n^3) entries, to pin down the original graph, which is described by only O(n^2) entries. But no, it is still very hard to solve.

So what if instead we are given a deck and try to recover the original graph. Can we do it fast? I may be wrong, but it seems there is no known polynomial-time algorithm, and even the complexity of this problem is not clear. This paper shows, for a specific type of graphs, an O(n^8) algorithm, which is the largest exponent I've seen for a poly-time algorithm. This paper studies some related problems around the reconstruction conjecture, showing that merely checking that a graph has a specific deck is already equivalent to graph isomorphism. Finally, there is a 1991 manual covering all major achievements to date, showing that the reconstruction conjecture is proved for some families of graphs, such as regular graphs.

Insights. After reading all this, I have the following thoughts.

* It would be cool to write a paper on an algorithm for reconstructing a graph from its deck. I have a simple idea of how to do it via a graph isomorphism solver and some heuristics. The problem is not purely abstract either: it has important applications, to chemoinformatics for example, so it's interesting.

* It would be cool to write a follow-up paper that uses RL to improve the heuristics. Graph reconstruction is under the radar compared to TSP, so it would be a first-of-its-kind RL algorithm for this problem.

* Initially I had the idea that if I took hard instances of graph isomorphism and computed decks for such graphs, I would get the same decks for non-isomorphic graphs, settling the reconstruction conjecture. The reason I thought this is that hard non-isomorphic graphs are very similar to each other, so their subgraphs should be identical. But regular graphs are already proved to be reconstructible, so many hard instances won't work (such as SRG or Miyazaki graphs). Then there are "very hard" graphs, which are half-regular (node degrees take only two distinct values across the graph). I tried those, but found that the decks are completely different for non-isomorphic graphs. So my idea failed, and I spent two days reading, thinking, and implementing it 😀 A sketch of the comparison I ran is below.
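Here is roughly how such a check can be done with networkx, reusing deck() from the sketch above; decks are compared as multisets of isomorphism classes (greedy matching is valid because isomorphism is an equivalence relation). nx.is_isomorphic is exact but slow, which is fine for a small experiment.

```python
import networkx as nx  # deck() as defined in the earlier sketch

def same_deck(g: nx.Graph, h: nx.Graph) -> bool:
    # Compare two decks as multisets, card-by-card up to isomorphism.
    if g.number_of_nodes() != h.number_of_nodes():
        return False
    remaining = deck(h)
    for card in deck(g):
        match = next((c for c in remaining if nx.is_isomorphic(card, c)), None)
        if match is None:
            return False
        remaining.remove(match)
    return True
```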
Graph Machine Learning research groups: Karsten Borgwardt

I am doing a series of posts on research groups in graph ML. The third is Karsten Borgwardt. His group at ETH Zurich works on computational biology, and his lab is well known for developing some of the commonly used graph kernels (e.g. Weisfeiler-Leman, graphlet, shortest-path).

Karsten Borgwardt (1980)
- Affiliation: ETH Zurich;
- Education: Ph.D. at Ludwig-Maximilians-University in Munich in 2007 (supervised by Hans-Peter Kriegel);
- h-index: 47;
- Awards: Krupp Award, best papers at NeurIPS;
- Interests: graph kernels; molecular graph representation; computational biology
Online Math Seminars

There is a list of upcoming math seminars around the world that, due to the confinement, will be held online and can be accessed by anyone. Topics are broad, including combinatorics, group theory, and graphs.
KDD Cup 2020 AutoGraph Challenge

There is a Kaggle-like challenge, running until 25th May, on AutoML for graph embeddings: 5 datasets, a $33.5K prize pool, a node classification task, accuracy as the metric, and time constraints on training and prediction.
Discovering automorphisms of a graph and its deck

My hypothesis was that if you take some graph that is hard for the graph isomorphism problem and remove one vertex, the resulting graph will be much easier, because the symmetry of the original graph is broken. So I took the hardest known graphs for graph isomorphism and checked how long it takes to determine the automorphism group (which is similar to how hard it would be to run an isomorphism test).

The results are quite interesting. Indeed, for many subgraphs determining the automorphism group is 50-100 times faster. But surprisingly, there are subgraphs that are harder than the original graph.

The image below shows results for 6 graphs from cfi-rigid-z2 with ~1000-2000 vertices; I checked the runtime for the original graph and for all subgraphs obtained by deleting one vertex. While for the first two graphs (first row) all subgraphs are easier, for the next 4 graphs there are smaller subgraphs that take 5x more time to solve than the original graph.

This could happen for three reasons: (1) the nauty solver has heuristics that worked better for the original graph than for the subgraphs; (2) the runtimes are not stable, and rerunning would give different results; (3) the smaller graphs are somehow indeed harder than the original graph. I think (2) is very unlikely, and my guess is a combination of (1) and (3): removing a vertex makes vertices that were equivalent in the original graph non-equivalent in the subgraph, which reduces the amount of pruning nauty can do.
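For reference, a minimal sketch of the timing experiment, assuming the pynauty wrapper around nauty (the input file name is hypothetical):

```python
import time
import networkx as nx
from pynauty import Graph as NautyGraph, autgrp

def to_nauty(g: nx.Graph) -> NautyGraph:
    # Convert a networkx graph (relabeled to 0..n-1) to pynauty's format.
    g = nx.convert_node_labels_to_integers(g)
    return NautyGraph(g.number_of_nodes(),
                      adjacency_dict={v: list(g.neighbors(v)) for v in g})

def autgrp_seconds(g: nx.Graph) -> float:
    # Wall-clock time for nauty to compute the automorphism group.
    start = time.perf_counter()
    autgrp(to_nauty(g))
    return time.perf_counter() - start

g = nx.read_edgelist("cfi-rigid-z2-example.edges")  # hypothetical file
base = autgrp_seconds(g)
for v in list(g.nodes):
    sub = g.copy()
    sub.remove_node(v)
    print(f"deleted {v}: {autgrp_seconds(sub) / base:.2f}x original runtime")
```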
Some notes on graph visualization

Stephen Wolfram, creator of the Wolfram language, recently published the post Finally We May Have a Path to the Fundamental Theory of Physics… and It’s Beautiful, where he discusses possible origins of how the universe operates. I think the crux of his idea is this: if you model interactions between objects as a graph and specify how new interactions arise from existing ones, you get beautiful graphs that look like 3D shapes and that, in the limit, could represent our universe; you can then analyze properties of these graphs, such as diameter or curvature, to find equivalent notions in physics.

I won't speculate on whether the post is theoretically sound, let physicists debate that; in the end, any new theory should predict new facts, which we will have to wait for. But one thing is noticeable: the graphs Wolfram draws are beautiful. If you try to draw a big graph yourself, you will find it very hard to avoid a mess, yet here you get pretty networks that indeed resemble familiar 3D shapes. The Wolfram language has many layouts for drawing graphs, each producing a different image of the same graph; from the shapes in the post, it seems he used the SpectralEmbedding or SpringElectricalEmbedding layout. Daniel Spielman, professor at Yale and two-time Gödel Prize winner, has a nice popsci video where he discusses how these drawings relate to spectral graph theory and what conditions on the adjacency matrix yield a nice drawing. So maybe next time you will use one of these layouts to impress the reviewers of your paper. A small example is below.
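If you want to try the spectral look outside the Wolfram language, here is a sketch with networkx, whose spectral_layout places nodes at coordinates given by eigenvectors of the graph Laplacian; the choice of graph is arbitrary.

```python
import networkx as nx
import matplotlib.pyplot as plt

# Any reasonably structured graph works; a random geometric graph is one example.
g = nx.random_geometric_graph(500, 0.1, seed=42)
pos = nx.spectral_layout(g)  # coordinates from Laplacian eigenvectors
nx.draw(g, pos, node_size=10, width=0.3)
plt.show()
```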
Discrete Differential Geometry Course

CS 15-458/858: Discrete Differential Geometry (Spring 2020) at Carnegie Mellon University. The lectures are available on YouTube. Discussions of the Laplace operator, smooth and discrete surfaces, curvature, etc.
April ArXiv: how many graph papers?

From 18 March to 17 April there were 300 new and 108 updated graph papers in the arXiv CS section. This is around 50 papers fewer than in the previous period.
Web Conference 2020

This week the fully virtual Web Conference 2020 takes place. It will last 5 days, and you can still register (~200 USD).

There are tracks on social networks, semantics (KG), and user modeling, which often deal with graphs.
About every third paper is on graphs.

There will be 4 tutorials and 2 workshops on graphs (Monday-Tuesday), which I described in this post.
A forgotten story of Soviet AI

I found out about the Weisfeiler-Leman algorithm about 5 years ago, and some time later I realized that both authors were from the USSR. That was quite unexpected. I started looking up information about the authors and found quite a good biography of Boris Weisfeiler, written by his sister, but not much about Andrey Leman. For about a year I searched for the people who knew him, one by one; they are now quite senior and don't use all the fancy messengers. Finally, I gathered enough to write a post on his life, from his interest in math olympiads, to the development of the first AI chess player, to working in Silicon Valley.

His life is emblematic of the generation of mathematicians of his time: strong performance in math olympiads, the competitive Moscow State University, work at the Institute of Theoretical and Experimental Physics, and then emigration to the West when the iron curtain fell. I like hearing these stories because they are reminiscent of the stories of my parents and their engineer friends. It is the voice of a time that is now inevitably gone. Like Babai's trip to the USSR, reading about these stories uncovers the foundations of the graph theory, computer science, and artificial intelligence that we study today, and lets us connect the dots between old and new approaches.
Geometric and Relational Deep Learning

A workshop on GML will be streamed tomorrow on YouTube. It will start at 9:20 and continue until 17:00. The list of speakers is great:

Peter Battaglia, DeepMind
Natalia Neverova, Facebook AI Research (FAIR)
Stephan Günnemann, TU Munich
Yaron Lipman, Weizmann Institute of Science
Miltos Allamanis, Microsoft Research
Qi Liu, University of Oxford
Pim de Haan, University of Amsterdam & Qualcomm AI Research
Noemi Montobbio, Italian Institute of Technology
Geometric and Relational Deep Learning Pt. 2

Apparently this workshop will also have poster sessions that are only available to registered participants. The list of papers can be found below (thanks to the people who attended).

• A geometric deep learning model to filter out anatomically non plausible fibers from tractograms [video]

• Patient-Specific Pathological Gait Modelling with Conditional-NRI [video]

• GRASP: Graph Alignment through Spectral Signatures

• Isomorphism Leakage in Multi-Interaction Datasets [video]

• Integrating Spectral and Spatial Domain Graph Neural Networks

• Unshackling Bisimulation with Graph Neural Networks

• State2vec: Learning Off-Policy State Representations [video]

• Are Graph Convolutional Networks Fully Exploiting Graph Structure? [video]

• Principal Neighbourhood Aggregation Networks [video]

• Attentive Group Equivariant Convolutional Networks [video]

• SMP: An Equivariant Message Passing Scheme for Learning Graph Structural Information [video]

• Evaluation of Molecular Fingerprints for Similarity-based Virtual Screening generated through Graph Convolution Networks [video]

• Network alignment with GNN [video]

• Learning Generative Models across Incomparable Spaces

• Learning Set Operations for Deformable Shapes [video]

• Instant recovery of shape from spectrum via latent space connections [video]

• SIGN: Scalable Inception Graph Neural Networks [video]

• Universal Invariant and Equivariant Graph Neural Networks [video]

• Graph Convolutional Gaussian Processes for Link Prediction [video]

• Deep Graph Mapper: Seeing Graphs through the Neural Lens [video]

• Geoopt: Riemannian Optimization in PyTorch [video]

• HyperLearn: A Distributed Approach for Representation Learning in Datasets With Many Modalities

• Multi-relational Poincaré Graph Embeddings [video]

• On Understanding Knowledge Graph Representation [video]

• Learning Object-Object Relations in Video [video]
Graph Machine Learning research groups: Philip S. Yu

I am doing a series of posts on research groups in graph ML. The fourth is Philip S. Yu. Fun fact: in 2019 he had 136 papers on Google Scholar (approximately one paper every 2.5 days).

Philip S. Yu (~1952)
- Affiliation: University of Illinois at Chicago;
- Education: Ph.D. at Stanford in 1978; MBA at NYU in 1982;
- h-index: 161;
- Awards: awards at KDD, IEEE CS, ICDM;
- Interests: data mining, anomaly detection on graphs, graph surveys, GNN.