Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out @gimmeblues, @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
NeurIPS 2020. Comprehensive analysis of authors, organizations, and countries.

This post analyzes which authors and organizations are publishing at NeurIPS 2020 this December, similar to the analysis I did for ICML 2020.

In addition to general insights (which I also found interesting), there are two collaboration graphs that I created: one between affiliations and one between authors. What's exciting is that these two networks are very different from each other: the graph of authors is actually quite disconnected, with many small groups of people (of size ~50) and a large diameter (25 hops). It could be interesting in the future to understand why that is and what these small groups are.
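The statistics above (number of components, diameter of a component) can be sketched in plain Python with BFS; the collaboration graph here is a made-up toy, not the actual NeurIPS data.

```python
# A minimal sketch (pure Python, hypothetical toy data) of how the
# author-graph statistics above can be computed: connected components
# via BFS, and the diameter of a component via BFS from every node.
from collections import deque

def components(adj):
    """Connected components of an undirected graph given as {node: set(neighbors)}."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        comps.append(comp)
    return comps

def diameter(adj, comp):
    """Longest shortest path within one component (BFS from each node)."""
    best = 0
    for src in comp:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        best = max(best, max(dist.values()))
    return best

# Toy collaboration graph: two small disconnected groups.
adj = {0: {1}, 1: {0, 2}, 2: {1}, 3: {4}, 4: {3}}
comps = components(adj)
print(len(comps))                             # 2 components
print(max(diameter(adj, c) for c in comps))   # 2 (path 0-1-2)
```

The real graphs are much larger, but the same two BFS passes give the numbers quoted above.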
GML YouTube Videos

I was pleasantly surprised to see there is a YouTube playlist by Zak Jost that covers some aspects of GNNs, including an interview with DeepMind authors on using GNNs for physics simulation.
CIKM 2020 stats

Dates: Oct 19-23
Where: Online
Price: €70

Graph papers can be found at Paper Digest.

• 970/397 full/short submissions (vs 1030/470 in 2019)
• 193/103 accepted (vs 202/107 in 2019)
• 20%/26% acceptance rate (vs 19%/21% in 2019)
• ~97 total graph papers (20% of total)
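As a quick sanity check of the 2020 acceptance rates (assuming they are simply accepted/submitted, rounded):

```python
# Acceptance rates for 2020 implied by the submission/acceptance counts above.
full_rate = 193 / 970    # full papers: ~0.199
short_rate = 103 / 397   # short papers: ~0.259
print(round(full_rate * 100), round(short_rate * 100))  # 20 26
```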
Open Catalyst Project

Facebook and CMU launched the Open Catalyst Project, which contains the largest dataset for quantum chemistry predictions. The goal is to predict atomic interactions faster than quantum mechanical simulations (DFT), which can be framed as a graph regression task.
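A hedged sketch of what "graph regression" means here: each structure becomes a graph of atoms (nodes) with edges between nearby atoms, and the target is a scalar energy. The coordinates, cutoff, and label below are illustrative only, not the actual Open Catalyst format.

```python
# Toy representation of one training example: atoms as nodes, a radius
# graph as edges, and a scalar regression target (in the real dataset,
# a DFT-computed energy). All numbers here are made up.
import math

def radius_graph(positions, cutoff=2.0):
    """Connect every pair of atoms closer than `cutoff` (toy O(n^2) version)."""
    edges = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if math.dist(positions[i], positions[j]) < cutoff:
                edges.append((i, j))
    return edges

# Toy structure: 3 atoms with made-up 3D coordinates and a fake energy label.
positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
energy_label = -1.23  # in a real dataset this would come from DFT
edges = radius_graph(positions)
print(edges)  # [(0, 1)]; only the two nearby atoms are connected
```

A GNN then maps (nodes, edges) to a predicted energy and is trained against the label.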
Graph Machine Learning research groups: Jiliang Tang

I do a series of posts on the groups in graph research; the previous post is here. The 17th is Jiliang Tang, coauthor of the book "Deep Learning on Graphs".

Jiliang Tang (~1974)
- Affiliation: Michigan State University
- Education: Ph.D. at Arizona State University in 2015 (advisor: Huan Liu)
- h-index 56
- Awards: best paper awards KDD, WSDM; Yahoo! awards; Distinguished Withrow Research Award; NSF Career Award
- Interests: graph neural networks, network analysis, anomaly detection on graphs
Genesis Therapeutics: a startup working on GNN drug discovery

Launched in November 2019 out of Stanford's Pande Lab, Genesis Therapeutics is researching GNNs and graph generative models for drug discovery. They recently announced a partnership with Genentech, a large biotech company, to test their ML platform for pharma.
Fresh picks from ArXiv
Today at ArXiv: building graphs from pretrained language models, graph information bottleneck, and quantum entanglement ⚛️

If I forgot to mention your paper, please shoot me a message and I will update the post.

Conferences
- XLVIN: eXecuted Latent Value Iteration Nets NeurIPS-DeepRL 2020, with Petar Veličković
- Learning to Execute Programs with Instruction Pointer Attention Graph Neural Networks NeurIPS 2020
- Graph Information Bottleneck NeurIPS 2020, with Jure Leskovec
- Beta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs NeurIPS 2020, with Jure Leskovec
- Graph Geometry Interaction Learning NeurIPS 2020
- Rethinking pooling in graph neural networks NeurIPS 2020
- Heterogeneous Hypergraph Embedding for Graph Classification WSDM 2021
- Contextual Heterogeneous Graph Network for Human-Object Interaction Detection ECCV-2020

Graphs
- A Differentiable Relaxation of Graph Segmentation and Alignment for AMR Parsing with Ivan Titov
- Graph and graphon neural network stability with Alejandro Ribeiro
- Language Models are Open Knowledge Graphs
- Can entanglement hide behind triangle-free graphs?


Survey
- Model Extraction Attacks on Graph Neural Networks: Taxonomy and Realization
Python Bindings of JGraphT

There is a new Python binding for the popular Java library JGraphT, which is exciting news for those who want efficiency when working with graphs (on top of other recent news, such as Nvidia's GPU-accelerated package).

JGraphT is a Java library that contains very efficient and generic graph data structures along with a large collection of state-of-the-art algorithms. What's great is that the Python bindings expose an easy interface on all operating systems (including Windows) without requiring you to install a JVM. JGraphT is known for its efficiency, reliability, and large collection of graph algorithms, including PageRank, flows, cuts, vertex covers, colorings, isomorphism checking, and more.
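To give a feel for one of the algorithms in JGraphT's collection, here is a minimal power-iteration PageRank in plain Python; this is only an illustrative sketch for intuition, not the library's (far more optimized) implementation.

```python
# Minimal power-iteration PageRank on a directed graph given as
# {node: list(successors)}. Dangling nodes spread their rank uniformly.
def pagerank(adj, damping=0.85, iters=50):
    n = len(adj)
    rank = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in adj}
        for u, succs in adj.items():
            if not succs:  # dangling node
                for v in adj:
                    new[v] += damping * rank[u] / n
            else:
                for v in succs:
                    new[v] += damping * rank[u] / len(succs)
        rank = new
    return rank

ranks = pagerank({0: [1], 1: [2], 2: [0]})
print(ranks)  # a symmetric 3-cycle, so all ranks converge to 1/3
```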
Twitter Recsys 2020 Challenge

In a new post, Michael Bronstein describes the top-3 solutions of the recent Twitter RecSys challenge, whose goal was to predict user engagements (likes, retweets, etc.) for future tweets. As expected, the top entries used gradient boosting rather than neural networks, although some used BERT and other language models to build tweet representations. It would be interesting to see whether the fact that Twitter is a graph can be used to improve the models' predictions.
Some History on the Reconstruction Conjecture

I already wrote about the reconstruction conjecture (here and here) and must admit that it's my weakness: every so often you find some fact about it and continue falling down the rabbit hole. Its simplicity, combined with the sheer volume of possibilities, is unlike any other problem. Two other good ones are graph isomorphism and P vs NP, but those seem very hard to resolve, while for the reconstruction conjecture you just need to find a pair of graphs as a counterexample: anyone can do it.

It seemed that very few people have tried to disprove it, and the general opinion, as far as I can judge, was that the conjecture holds. However, I came across a very amusing and insightful presentation by the mathematician Allen Schwenk, a PhD student of another prominent mathematician, Frank Harary, who contributed a lot to this conjecture. In this presentation he explains why he is strongly convinced that the conjecture is false.

Here is a quote from our exchange that I think expresses the feeling of trying to solve the reconstruction conjecture: "...But I have returned to this problem every year for the past 43 years for at least 50 hours each year. I have tried maybe a hundred constructions, with no success. I think I have pondered this in bed as I fall asleep at least 1000 times. So either this demonstrates that my intended method is no good, or that I am not smart enough to make it work... I would like nothing more than to see this problem solved before I die. I still believe strongly that the conjecture is false, but possibly that examples are exponentially large." Maybe some of my readers will solve it one day.
JAX Molecular Dynamics

I have already mentioned exciting work from DeepMind on simulating the dynamics of particles: each particle is a node in a graph, nearest neighbors are connected to each other, and the goal is to predict the acceleration at every time step, which is then fed to a physics engine to predict the next state of the particles. The code and data were recently released here.
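The simulation loop described above can be sketched as: build a radius graph, let a learned model predict per-particle accelerations from it, then integrate. The "model" below is a made-up placeholder that pulls neighbors together, not DeepMind's trained GNN.

```python
# Sketch of one step of a learned particle simulator in 2D (toy version).
import math

def radius_edges(pos, cutoff):
    """Directed edges between all particle pairs closer than `cutoff`."""
    return [(i, j) for i in range(len(pos)) for j in range(len(pos))
            if i != j and math.dist(pos[i], pos[j]) < cutoff]

def fake_gnn_acceleration(pos, edges):
    """Placeholder for the learned model: pull each particle toward neighbors."""
    acc = [[0.0, 0.0] for _ in pos]
    for i, j in edges:
        acc[i][0] += pos[j][0] - pos[i][0]
        acc[i][1] += pos[j][1] - pos[i][1]
    return acc

def step(pos, vel, dt=0.01):
    """One semi-implicit Euler step using the predicted accelerations."""
    edges = radius_edges(pos, cutoff=1.5)
    acc = fake_gnn_acceleration(pos, edges)
    vel = [[v[0] + dt * a[0], v[1] + dt * a[1]] for v, a in zip(vel, acc)]
    pos = [[p[0] + dt * v[0], p[1] + dt * v[1]] for p, v in zip(pos, vel)]
    return pos, vel

pos, vel = [[0.0, 0.0], [1.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]]
pos, vel = step(pos, vel)
print(pos)  # the two neighboring particles drift slightly toward each other
```

In the real system, the placeholder is a message-passing GNN trained to match ground-truth accelerations.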

But there is another Google package, JAX MD, for performing molecular dynamics in JAX, a numpy-like package with autograd, purely in Python (paper and Colab notebook). The authors report GPU-accelerated experiments with end-to-end training for hundreds of thousands of particles. The work has been accepted to NeurIPS 2020.