Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Self-supervised learning of GNNs

Self-supervised learning (SSL) is a learning paradigm for settings where we have large amounts of unlabeled data and want to learn representations of the input that we can later use for downstream tasks. The difference between unsupervised and self-supervised learning is that unsupervised learning typically learns a representation of a single input, while SSL assumes a model trained across several inputs.

An example of unsupervised learning on graphs is graph kernels, which boil down to counting some statistics on graphs (e.g., motifs) that then represent a graph. An example of SSL is first creating multiple views of the same graph (e.g., by perturbing the edges) and then training a model to distinguish views of different graphs. DeepWalk, node2vec, and other pre-GNN node embeddings are somewhere in between: they are usually applied to a single graph, but the concept could just as well be applied to learning representations across many graphs.
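To make the view-based recipe concrete, here is a minimal sketch of graph contrastive learning in plain PyTorch. The tiny mean-aggregation encoder, the edge-dropping rate, and the loss details are illustrative assumptions, not a specific published method:

```python
import torch
import torch.nn.functional as F

def drop_edges(adj: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Create a 'view' of a graph by randomly dropping a fraction p of edges."""
    keep = (torch.rand_like(adj) > p).float().triu(1)
    keep = keep + keep.t()                    # keep the view symmetric
    return adj * keep

class Encoder(torch.nn.Module):
    """A tiny mean-aggregation GNN mapping (adjacency, features) to one vector."""
    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.lin1 = torch.nn.Linear(in_dim, hid_dim)
        self.lin2 = torch.nn.Linear(hid_dim, hid_dim)

    def forward(self, adj: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        h = F.relu(self.lin1(adj @ x))        # one propagation step
        h = self.lin2(adj @ h)                # second propagation step
        return h.mean(dim=0)                  # mean-pool nodes -> graph vector

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5):
    """NT-Xent-style loss: the two views of the same graph attract,
    views of different graphs in the batch repel."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / tau
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float('-inf'))
    target = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, target)

def random_graph(n: int = 10, p: float = 0.3, d: int = 8):
    """A toy random graph: symmetric 0/1 adjacency plus node features."""
    a = torch.bernoulli(torch.full((n, n), p)).triu(1)
    return a + a.t(), torch.randn(n, d)

# Toy usage: a batch of random graphs, two views each.
enc = Encoder(in_dim=8, hid_dim=32)
graphs = [random_graph() for _ in range(4)]
z1 = torch.stack([enc(drop_edges(a), x) for a, x in graphs])
z2 = torch.stack([enc(drop_edges(a), x) for a, x in graphs])
loss = contrastive_loss(z1, z2)
loss.backward()                               # ready for an optimizer step
```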

There has been a recent boom in this area for graphs, so fresh surveys are available (here and here), as well as an awesome list of SSL-GNN papers.
Awesome graph repos

Collections of methods and papers for specific graph topics.

Graph-based Deep Learning Literature — Links to conference publications, the top 10 most-cited publications, related workshops, and surveys/literature reviews/books in graph-based deep learning.

awesome-graph-classification — A collection of graph classification methods, covering embedding, deep learning, graph kernel, and factorization papers with reference implementations.

Awesome-Graph-Neural-Networks — A collection of resources related to graph neural networks.

awesome-graph — A curated list of resources for graph databases and graph computing tools.

awesome-knowledge-graph — A curated list of knowledge-graph-related learning materials, databases, tools, and other resources.

awesome-knowledge-graph — A curated list of awesome knowledge graph tutorials, projects, and communities.

Awesome-GNN-Recommendation — Graph mining for recommender systems.

awesome-graph-attack-papers — Links to works about adversarial attacks and defenses on graph data or GNNs.

Graph-Adversarial-Learning — Attack-related papers, defense-related papers, robustness certification papers, etc., ranging from 2017 to 2021.

awesome-self-supervised-gnn — Papers about self-supervised learning on GNNs.

awesome-self-supervised-learning-for-graphs — A curated list of awesome self-supervised graph representation learning resources.

Awesome-Graph-Contrastive-Learning — A collection of resources related to graph contrastive learning.
Graph Machine Learning research groups: Leman Akoglu

I am doing a series of posts on groups in graph research; the previous post is here. The 27th is Leman Akoglu, a professor at Carnegie Mellon University whose interests include detecting anomalies in graphs.

Leman Akoglu (~1983)
- Affiliation: Carnegie Mellon University
- Education: Ph.D. at Carnegie Mellon University in 2012 (advisor: Christos Faloutsos)
- h-index: 40
- Interests: anomaly detection, graph neural networks
- Awards: best research papers at PAKDD, SIAM SDM, ECML PKDD
Graph Neural Networks in Computational Biology

Slides from Petar Veličković about his journey applying machine learning algorithms to biological data.
GNN User Group: meeting 4

The fourth meeting of the GNN user group will include a talk from me (Sergey Ivanov) on combining GBDTs and GNNs, and a talk from Professor Pan Li of Purdue University on constructing structural features to improve representations in temporal networks. Please join us on Thursday!
Geometric Deep Learning Book

A new book on geometric deep learning by graph ML experts Michael M. Bronstein, Joan Bruna, Taco Cohen, and Petar Veličković has been released. 156 pages exploring the symmetries that unify different neural network architectures. An accompanying post nicely introduces the history of geometry and its impact on physics. It's exciting to see a categorization of many ML approaches from the perspective of group theory.
Graph Representation Learning for Drug Discovery Slides

Slides from Jian Tang's talk on de novo drug discovery and drug repurposing.
Invariant and equivariant layers with applications to GNN, PointNet and Transformers

A blog post by Marc Lelarge about invariant and equivariant functions and their relation to the universality and expressivity of GNNs. As the main result, they show that any invariant/equivariant function on n points can be represented as a sum of functions applied to each point independently.
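A minimal PyTorch sketch of that sum decomposition for the invariant case, in the spirit of the Deep Sets construction (the names phi/rho and all layer sizes are illustrative):

```python
import torch

class InvariantFn(torch.nn.Module):
    """f(x_1, ..., x_n) = rho(sum_i phi(x_i)): permutation-invariant by construction."""
    def __init__(self, in_dim: int, hid_dim: int, out_dim: int):
        super().__init__()
        self.phi = torch.nn.Sequential(
            torch.nn.Linear(in_dim, hid_dim), torch.nn.ReLU())
        self.rho = torch.nn.Linear(hid_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: [n, in_dim]
        return self.rho(self.phi(x).sum(dim=0))          # sum makes order irrelevant

f = InvariantFn(3, 16, 2)
x = torch.randn(5, 3)
# Permuting the points does not change the output.
print(torch.allclose(f(x), f(x[torch.randperm(5)]), atol=1e-5))  # True
```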
Video: Workshop of Graph Neural Networks and Systems (GNNSys'21)

Very interesting videos from the workshop at MLSys 2021 on GNNs in industry. The talks cover topics such as GNNs on Graphcore's IPU, chip placement optimization, particle reconstruction at the Large Hadron Collider, and more.
GML In-Depth: three forms of self-supervised learning

My new in-depth newsletter on self-supervised learning with applications to graphs. There is an upcoming keynote talk by Alexei Efros at ICLR'21 about self-supervised learning, and I was inspired by the motivation he discusses there. In particular, he explains that self-supervised learning is a way to reduce the role of humans in designing ML pipelines, which would allow neural nets to learn in a way similar to how humans do. Self-supervised learning for graphs is an active area of research, and for good reason: in applications such as drug or catalyst discovery, there are billions of unlabeled graphs from which we would like to extract as much relevant information as possible. Self-supervised learning is thus becoming a new paradigm for learning such useful representations.
Knowledge Graphs @ ICLR 2021

The one and only Michael Galkin does it again with a superb digest of knowledge graph research at ICLR 2021. Topics include reasoning, temporal logics, and complex question answering in KGs: a lot of novel ideas and less SOTA-chasing work!
Fresh picks from ArXiv
This week on ArXiv: optimization properties of GNNs, a review of sampling-based approaches, and time zigzags for Ethereum price prediction 💰

If I forgot to mention your paper, please shoot me a message and I will update the post.

Conferences
* Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders ACL 2021
* Neural Graph Matching based Collaborative Filtering SIGIR 2021
* Z-GCNETs: Time Zigzags at Graph Convolutional Networks for Time Series Forecasting ICML 2021
* Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth ICML 2021

Efficiency
* Scalable Graph Neural Network Training: The Case for Sampling
* VersaGNN: a Versatile accelerator for Graph neural networks
New Proof Reveals That Graphs With No Pentagons Are Fundamentally Different

A new article at Quanta about the Erdős–Hajnal conjecture, which states that any graph that forbids some fixed subgraph will inevitably have a large clique or a large independent set. The article discusses a recent paper that confirms the conjecture for a special case that was deemed the hardest. Now there is hope that the conjecture is true in the general case.
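For reference, the conjecture's standard formal statement (with ω and α denoting the clique and independence numbers; "large" means polynomial in the number of vertices n):

```latex
% Erdős–Hajnal conjecture: for every graph H there is a constant c_H > 0
% such that every n-vertex graph G with no induced copy of H contains a
% clique or an independent set of polynomial size.
\forall H \;\; \exists\, c_H > 0 : \quad
G \text{ is induced-}H\text{-free} \;\Longrightarrow\;
\max\{\omega(G),\, \alpha(G)\} \ge n^{c_H}
```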
Constructions in combinatorics via neural networks

I have been fascinated by the potential of using machine learning for combinatorial problems and have written multiple posts (here and here) and a survey about this. So it was exciting to see a work that applies an RL framework to disprove several combinatorial conjectures.

The algorithm is very simple: generate many graphs with an MLP, select the top X of them, and use the cross-entropy method to update the MLP. It uses neither recent advances in RL nor GML architectures that respect the invariances of the input, so there is room for improvement. It also generates graphs of a predetermined size, so if a counterexample has a large order, it would be difficult to know that in advance. Still, it would be very interesting to apply this framework to more complicated conjectures such as the reconstruction conjecture.
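For intuition, here is a minimal sketch of that loop in PyTorch. The score function, the sizes, and the one-shot Bernoulli sampling are simplifications (the paper builds the edge sequence step by step and plugs in a conjecture-specific reward):

```python
import torch
import torch.nn.functional as F

N = 19                                    # number of vertices, fixed in advance
E = N * (N - 1) // 2                      # one decision per potential edge

# An MLP that outputs one logit per potential edge (simplified: we sample
# all edges at once instead of sequentially).
policy = torch.nn.Sequential(
    torch.nn.Linear(E, 128), torch.nn.ReLU(), torch.nn.Linear(128, E))

def score(g: torch.Tensor) -> float:
    """Placeholder reward measuring how close a graph is to violating the
    conjecture under attack; a real run substitutes the actual quantity."""
    return -g.sum().item()

opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
for step in range(200):
    logits = policy(torch.zeros(256, E))              # 256 candidate graphs
    graphs = torch.bernoulli(torch.sigmoid(logits))   # 0/1 edge indicators
    scores = torch.tensor([score(g) for g in graphs])
    elite = graphs[scores.topk(25).indices]           # keep the top ~10%
    # Cross-entropy update: push the policy toward the elite graphs.
    loss = F.binary_cross_entropy_with_logits(policy(torch.zeros(25, E)), elite)
    opt.zero_grad()
    loss.backward()
    opt.step()
```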