Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Theoretical Foundations of Graph Neural Networks

Video presentation by Petar Veličković, who covers the design, history, and applications of GNNs. A lot of interesting concepts, such as permutation invariance and equivariance, are discussed. Slides can be found here.
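
To make the two properties concrete, here is a minimal sketch (plain NumPy, my own illustration rather than anything from the talk): a simple message-passing layer is permutation-equivariant, while a sum readout on top of it is permutation-invariant.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
X = rng.normal(size=(n, d))            # node features
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                 # symmetric adjacency
np.fill_diagonal(A, 0)
W = rng.normal(size=(d, d))

def gnn_layer(A, X, W):
    # one simple message-passing step: aggregate neighbours (plus self), then transform
    return np.tanh((A + np.eye(len(A))) @ X @ W)

P = np.eye(n)[rng.permutation(n)]      # random permutation matrix

H = gnn_layer(A, X, W)
H_perm = gnn_layer(P @ A @ P.T, P @ X, W)

# Equivariance: permuting the nodes permutes the layer output the same way.
assert np.allclose(P @ H, H_perm)
# Invariance: a sum readout yields the same graph-level vector either way.
assert np.allclose(H.sum(axis=0), H_perm.sum(axis=0))
```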
Graph Machine Learning research groups: Tyler Derr

I do a series of posts on the groups in graph research; the previous post is here. The 24th is Tyler Derr, a young professor in graph ML who proposed signed GNNs for graphs with negative links.

Tyler Derr (~1992)
- Affiliation: Vanderbilt University
- Education: Ph.D. at Michigan State University in 2020 (advisor: Jiliang Tang)
- h-index 10
- Awards: best papers at SDM
- Interests: adversarial attacks, graph neural networks
Paper Notes in Notion

Vitaly Kurin has come up with a great format for tracking notes on the papers he reads. These are short and clean digests of papers at the intersection of GNNs and RL, and I would definitely recommend looking them up if you are studying the same papers. You can recreate the same format in Notion by adding a new page (database -> list), clicking the New button, and selecting the necessary properties.
MLSys 2021 Conference

MLSys is one of the main conferences on real-world applications of ML. Accepted papers for MLSys 2021 are available here. It will also feature a GNN workshop and keynote speakers from NVIDIA, PyTorch, and others. The dates are April 5-9, 2021. Registration is $25 for students.
Top-10 Research Papers in AI

A new blog post about the top-10 most cited papers in AI over the last 5 years. I looked at major AI conferences and journals (excluding CV and NLP venues).

It was quite a refreshing experience to realize that much of what we use today by default was discovered only within the last few years: things like Adam, Batch Norm, GCNs, etc.
Deep Learning and Combinatorial Optimization IPAM Workshop

A great workshop on the intersection of ML, RL, GNNs, and combinatorial optimization. Videos are available. Topics include applications of ML to chip design, TSP, physics, integer programming and more.
Different styles of communication 😊
If We Draw Graphs Like This, We Can Change Computers Forever

The title is catchy, but the article is "only" about an improvement for the dynamic planarity testing problem. Planarity testing, i.e. deciding whether a graph can be drawn without crossing edges, is a well-studied problem, and O(n) algorithms are known. This article, on the other hand, studies the case where edges may be added and removed, and the question is how to redraw the graph so that it stays planar. The results were published at STOC'20.
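
For the static version of the problem, you can already play with planarity testing in networkx (this is just a quick illustration, not the dynamic algorithm from the paper): K4 is planar, K5 is not.

```python
import networkx as nx

for G in (nx.complete_graph(4), nx.complete_graph(5)):
    is_planar, cert = nx.check_planarity(G, counterexample=True)
    print(G, "planar:", is_planar)
    # If planar, `cert` is a planar embedding; otherwise it is a Kuratowski
    # subgraph (a K5 or K3,3 subdivision) witnessing non-planarity.
```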
A Complete Beginner's Guide to G-Invariant Neural Networks

A tutorial by S. Chandra Mouli and Bruno Ribeiro about G-invariant neural networks, eigenvectors, invariant subspaces, transformation groups, Reynolds operator, and more. Soon, there should be more tutorials on the topic of invariance and linear algebra.
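
As a back-of-the-envelope sketch of one of the central objects, the Reynolds operator averages an arbitrary function over a finite group, which makes it G-invariant by construction. Below is my own toy illustration (not code from the tutorial) with G being the permutation group acting on vector coordinates.

```python
import itertools
import numpy as np

def reynolds(f, x):
    # average f over all permutations of the coordinates of x
    perms = itertools.permutations(range(len(x)))
    return np.mean([f(x[list(p)]) for p in perms])

f = lambda v: v[0] ** 2 + 3 * v[1] - v[2]   # not invariant on its own
x = np.array([1.0, 2.0, 3.0])

# The averaged function gives the same value for any reordering of x.
vals = {round(reynolds(f, x[list(p)]), 9)
        for p in itertools.permutations(range(3))}
print(vals)   # a single value -> invariance
```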
Graph Transformer: A Generalization of Transformers to Graphs

A blog post by Vijay Prakash Dwivedi that discusses his paper with Xavier Bresson, A Generalization of Transformer Networks to Graphs, presented at the 2021 AAAI Workshop (DLG-AAAI'21). It looks like a generalization of the GAT network with batch norm and positional encodings, though it still aggregates over local neighborhoods.
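
If I recall correctly, the positional encodings in the paper are Laplacian eigenvectors. Here is a rough sketch of computing them on a toy graph (my own illustration with networkx and NumPy, not the authors' code):

```python
import networkx as nx
import numpy as np

G = nx.cycle_graph(8)
L = nx.normalized_laplacian_matrix(G).toarray()
eigvals, eigvecs = np.linalg.eigh(L)        # eigenvalues in ascending order

k = 3
pos_enc = eigvecs[:, 1:k + 1]               # skip the trivial eigenvector
print(pos_enc.shape)                        # (num_nodes, k), one k-dim position per node
```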

My feeling after studying heterophily is that we will see more works that go beyond local neighborhoods and perhaps define neighborhoods not as something given by the graph topology but as something we have to learn. For example, we could define attention from each node to all other nodes in the graph and treat graph distances as additional features. This could be difficult to scale, so sampling methods would probably be needed, but letting the network decide which nodes are important for aggregation seems like a promising direction.
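
To make the idea concrete, here is a toy sketch of what I mean (my own illustration, not an existing model): attention from every node to every other node, with shortest-path distance folded in as a bias so the topology is not discarded entirely.

```python
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
n = G.number_of_nodes()
rng = np.random.default_rng(0)

X = rng.normal(size=(n, 16))                          # node features
D = np.array(nx.floyd_warshall_numpy(G))              # pairwise shortest-path distances

scores = X @ X.T / np.sqrt(X.shape[1]) - 0.5 * D      # distance-biased attention logits
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)               # softmax over ALL nodes, not just neighbors

H = attn @ X                                          # global aggregation step
```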
PyTorch Geometric Temporal

PyG-Temporal is an extension of PyG for temporal graphs. It now includes more than 10 GNN models and several datasets. With the world being dynamic, I see more and more applications where a standard GNN wouldn't work and one has to resort to dynamic GNNs.
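
The general pattern behind many such models is to run a GNN on each graph snapshot and carry node states through time with a recurrent cell. Below is a hand-rolled sketch of that pattern in plain PyTorch (not PyG-Temporal's actual API):

```python
import torch
import torch.nn as nn

class SnapshotGNN(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)
        self.gru = nn.GRUCell(hid_dim, hid_dim)

    def forward(self, snapshots):
        # snapshots: list of (A, X) pairs; A: (n, n) adjacency, X: (n, in_dim)
        h = None
        for A, X in snapshots:
            A_hat = A + torch.eye(A.size(0))          # add self-loops
            msg = torch.relu(self.lin(A_hat @ X))     # one message-passing step
            h = self.gru(msg, h)                      # update temporal node state
        return h                                      # final node embeddings

n, T = 6, 4
snaps = [(torch.randint(0, 2, (n, n)).float(), torch.randn(n, 8)) for _ in range(T)]
model = SnapshotGNN(8, 16)
print(model(snaps).shape)                             # torch.Size([6, 16])
```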
Large-scale graph machine learning challenge (OGB-LSC) at KDD Cup 2021

OGB-LSC is a collection of three graph datasets—PCQM4M-LSC, WikiKG90M-LSC, and MAG240M-LSC—that are orders of magnitude larger than existing ones. The three datasets correspond to link prediction, graph regression, and node classification tasks, respectively. The goal of OGB-LSC is to empower the community to discover innovative solutions for large-scale graph ML.

The competition will be from March 15th, 2021 until June 8th, 2021 and the winners will be notified by mid-June 2021. The winners will be honored at the KDD 2021 opening ceremony and will present their solutions at the KDD Cup workshop during the conference.

The graphs are indeed big (the largest dataset is 168 GB), and it will be interesting to see which approaches can solve these problems.
A Tale of Three Implicit Planners and the XLVIN agent

A video presentation by Petar Veličković about implicit planners, which can be seen as a middle ground between model-based and model-free approaches to RL planning problems. The talk covers three popular implicit planners: VIN, ATreeC, and XLVIN. All three build on the recently popularised idea of algorithmically aligning to a planning algorithm, but with different realisations.
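
For context, the classical planning algorithm these architectures are designed to align with is value iteration. A tiny NumPy sketch of it (my own illustration, not from the talk):

```python
import numpy as np

n_states, n_actions, gamma = 4, 2, 0.9
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a, s'] transition probs
R = rng.normal(size=(n_states, n_actions))                        # R[s, a] rewards

V = np.zeros(n_states)
for _ in range(100):
    Q = R + gamma * P @ V        # Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] * V[s']
    V = Q.max(axis=1)            # Bellman optimality backup
print(V)
```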
Graph Machine Learning research groups: Yaron Lipman

I do a series of posts on the groups in graph research; the previous post is here. The 25th is Yaron Lipman, a professor in Israel who has co-authored many papers on equivariance and the expressive power of GNNs.

Yaron Lipman (~1980)
- Affiliation: Weizmann Institute of Science
- Education: Ph.D. at Tel Aviv University in 2008 (advisors: David Levin and Daniel Cohen-Or)
- h-index 41
- Interests: geometric deep learning, meshes, 3d point clouds, equivariant networks
GML Express: large-scale challenge, top papers in AI, and implicit planners.

Another issue of my newsletter. I have finally resolved my struggle over what to write this newsletter about: news or insights. GML Express will cover the news (which you mostly get from this channel anyway), and GML In-Depth will cover my insights.

In this GML Express you will find a bunch of learning materials, recent video presentations, blog posts, and announcements.
DIG: Dive into Graphs library

A new Python library, DIG (with an accompanying paper), built on PyTorch for several graph tasks:
* Graph Generation
* Self-supervised Learning on Graphs
* Explainability of Graph Neural Networks
* Deep Learning on 3D Graphs