Graph Machine Learning
6.7K subscribers
53 photos
11 files
808 links
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out @gimmeblues, @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Learning mesh-based simulation with Graph Networks

Another work by DeepMind (ICLR '21) on how to simulate physical systems with GNNs. The principle is the same as in their previous work: build a graph for the system, process it with a GNN, obtain an acceleration for each node, and feed it to an Euler integrator to obtain each node's position at the next step. Again, very cool visualizations.
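The integration step can be sketched in a few lines (a minimal illustration of the predict-accelerations-then-integrate idea; the paper's actual integrator and state representation differ):

```python
import numpy as np

def euler_step(positions, velocities, accelerations, dt=0.01):
    """Semi-implicit Euler update: advance nodes using GNN-predicted accelerations."""
    velocities = velocities + dt * accelerations  # update velocities first
    positions = positions + dt * velocities       # then positions
    return positions, velocities

# toy example: 3 mesh nodes in 2D, accelerations as if predicted by a GNN
pos = np.zeros((3, 2))
vel = np.ones((3, 2))
acc = np.full((3, 2), 2.0)
pos, vel = euler_step(pos, vel, acc, dt=0.1)
```

Rolling this step forward in a loop, with the GNN re-predicting accelerations at each step, yields the simulated trajectory.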
Recent applications of expanders to graph algorithms

Informally, a graph is an expander if its nodes are robustly connected, i.e. removing some edges would not break connectivity. Expanders have been used extensively to improve the running time of many graph algorithms. This talk gives a gentle introduction to expanders and their applications to static, dynamic, iterative, and distributed graph algorithms.
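One standard way to make "robustly connected" precise is the edge expansion h(G): the minimum, over vertex sets S of at most half the nodes, of the ratio of boundary edges to |S|. A brute-force sketch (exponential, for tiny graphs only):

```python
from itertools import combinations

def edge_expansion(n, edges):
    """Brute-force h(G) = min over S with 0 < |S| <= n/2 of |boundary(S)| / |S|."""
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for subset in combinations(range(n), size):
            S = set(subset)
            # count edges with exactly one endpoint in S
            boundary = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, boundary / len(S))
    return best

# 4-cycle 0-1-2-3-0: the worst cut is an adjacent pair (2 boundary edges / 2 nodes)
print(edge_expansion(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 1.0
```

A family of graphs is an expander family when h(G) stays bounded below by a constant as the graphs grow.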
Graph Neural Networks for Binding Affinity Prediction

In-depth blog post about applications of GNNs to drug discovery and, in particular, to virtual screening for candidate molecules.
Postdoc position at EPFL

A very interesting postdoc position at EPFL to work on molecule design. The following text is by Andreas Loukas.

We are hiring a postdoc to work on the interface between AI and computational protein design. The project will be carried out at EPFL in collaboration with Bruno Correia, Michael Bronstein, Pierre Vandergheynst, and the Swiss Data Science Center.

We offer a 2-year position at EPFL, a vibrant university (well.. post covid) located in one of the most beautiful countries. The salary is very competitive.

The researcher will partake in an interdisciplinary effort to design novel proteins using tools from deep learning. The ideal candidate combines i) practical deep learning/GNN know-how and ii) experience with generative models and/or reinforcement learning. Knowledge of biology is not required, but a willingness to learn is.

Relevant work: https://tinyurl.com/1stzxmkj

If you are interested, send me by email: a motivation letter explaining how your expertise fits the current position, a CV, the names/addresses of three references, and three selected publications. We will start reviewing applications on the 15th of March.

Andreas Loukas (find email at andreasloukas.blog)
GNN User Group: meeting 2

The second meeting of the GNN user group organized by AWS and NVIDIA. There are 3 presentations: GNNs on GPUs, cuGraph, and learning mechanisms of GNNs. The event is free.
GNNSys'21 -- Workshop on Graph Neural Networks and Systems

A graph-related workshop organized at MLSys 2021, with a submission deadline of March 7. It could be particularly interesting, as it will highlight applications of GNNs in real production systems.
The Easiest Unsolved Problem in Graph Theory

Our new blog post about the reconstruction conjecture, a well-known graph theory problem with 80 years of results but no final proof yet. I have already written several posts in this channel about it, and to me it's one of the grand challenges in graph theory (along with the graph isomorphism problem). There seems to be quite some progress, so I hope to see it resolved during my lifetime. In the meantime, we considered graph families for which the reconstruction conjecture is known to be true and tried to come up with the easiest family of graphs that is still unresolved and has very few vertices. The resulting family is a type of bidegreed graphs (close to regular) on 20 vertices, which could probably be verified by computer (though it would take a year or so).
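For context, the conjecture says a graph on at least 3 vertices is determined up to isomorphism by its deck, the multiset of its vertex-deleted subgraphs. Computing a deck is easy; a toy sketch that returns labeled cards (the real deck is taken up to isomorphism, which is the hard part):

```python
def deck(n, edges):
    """For each vertex v, return the edge list of G - v (labeled, not up to isomorphism)."""
    cards = []
    for v in range(n):
        card = [(u, w) for u, w in edges if v not in (u, w)]
        cards.append(card)
    return cards

# path graph P3: 0-1-2
print(deck(3, [(0, 1), (1, 2)]))  # [[(1, 2)], [], [(0, 1)]]
```

The conjecture asks whether two non-isomorphic graphs can ever share the same deck; computer verification for a fixed family means checking all decks within it for collisions.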
The Transformer Network for the Traveling Salesman Problem

(video and slides) Another great tutorial from Xavier Bresson on the traveling salesman problem (TSP) and recent ML approaches to solving it. It gives a nice overview of current solvers such as Concorde and Gurobi and their computational complexity.
GML Newsletter: Homophily, Heterophily, and Oversmoothing for GNNs

Apparently, Cora and the OGB datasets are mostly assortative, i.e. nodes with the same label tend to be connected. In many real-world applications this is not the case: nodes from different groups are connected, while connections within groups are sparse. Such datasets are called disassortative graphs.

What has been realized in 2020, and now in 2021, is that typical GNNs like GCN do not work well on disassortative graphs, so several GNN architectures were proposed to get good performance on these datasets. Not only do these new GNNs work well on both assortative and disassortative graphs, but they also alleviate the problem of oversmoothing, i.e. they make it possible to effectively stack many GNN layers.
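The assortative/disassortative distinction is often quantified with the edge homophily ratio: the fraction of edges whose endpoints share a label (close to 1 means assortative, close to 0 disassortative). A minimal sketch:

```python
def edge_homophily(edges, labels):
    """Fraction of edges connecting same-label nodes: ~1 assortative, ~0 disassortative."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# toy graph: 4 nodes in two classes, arranged in a cycle a-a-b-b
labels = {0: "a", 1: "a", 2: "b", 3: "b"}
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(edge_homophily(edges, labels))  # 0.5
```

On this scale, citation networks like Cora score high, while heterophilous graphs (e.g. many web graphs) score low.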

In my new email newsletter I discuss this shift from assortative to disassortative GNNs and its relation to oversmoothing. What's interesting is that existing approaches still do not rely explicitly on the labels, but rather learn parameters to account for heterophily. In the future, I think there will be more tricks for integrating target labels directly into GNN algorithms.
Fresh picks from ArXiv
This week on ArXiv: 2 surveys on self-supervised graph learning, fair embeddings, and combined structural and positional node embeddings 🎭

If I forgot to mention your paper, please shoot me a message and I will update the post.

Survey
* Graph Self-Supervised Learning: A Survey with Philip S. Yu
* Graph-based Semi-supervised Learning: A Comprehensive Review
* Meta-Learning with Graph Neural Networks: Methods and Applications
* Benchmarking Graph Neural Networks on Link Prediction
* A Survey of RDF Stores & SPARQL Engines for Querying Knowledge Graphs

Embeddings
* Towards a Unified Framework for Fair and Stable Graph Representation Learning with Marinka Zitnik
* Node Proximity Is All You Need: Unified Structural and Positional Node and Graph Embedding with Danai Koutra
Theoretical Foundations of Graph Neural Networks

Video presentation by Petar Veličković, who covers the design, history, and applications of GNNs. A lot of interesting concepts, such as permutation invariance and equivariance, are discussed. Slides can be found here.
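Permutation invariance is easy to check numerically: a sum readout over node features gives the same result for any ordering of the nodes (a toy illustration, not taken from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))   # features of 5 nodes
perm = rng.permutation(5)     # an arbitrary reordering of the nodes

# sum pooling is permutation-invariant: reordering rows doesn't change the readout
readout = X.sum(axis=0)
readout_perm = X[perm].sum(axis=0)
assert np.allclose(readout, readout_perm)
```

Equivariance is the per-node analogue: permuting the input rows permutes the output rows the same way, which is exactly the property message-passing layers are built to satisfy.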
Graph Machine Learning research groups: Tyler Derr

I do a series of posts on groups in graph research; the previous post is here. The 24th is Tyler Derr, a young professor in graph ML who proposed signed GNNs for graphs with negative links.

Tyler Derr (~1992)
- Affiliation: Vanderbilt University
- Education: Ph.D. at Michigan State University in 2020 (advisor: Jiliang Tang)
- h-index 10
- Awards: best papers at SDM
- Interests: adversarial attacks, graph neural networks
Paper Notes in Notion

Vitaly Kurin discovered a great format for tracking notes on the papers he reads. These are short and clean digests of papers at the intersection of GNNs and RL, and I would definitely recommend looking it up if you are studying the same papers. You can create the same format in Notion by adding a new page (database -> list), then clicking the New button and selecting the necessary properties.
MLSys 2021 Conference

MLSys is one of the main conferences on applications of ML in the real world. Accepted papers for MLSys 2021 are available here. It will also feature a GNN workshop and keynote speakers from NVIDIA, PyTorch, and others. Dates are April 5-9, 2021. Registration is $25 for students.
Top-10 Research Papers in AI

A new blog post about the top-10 most cited papers in AI over the last 5 years. I looked at major AI conferences and journals (excluding CV and NLP conferences).

It was quite a refreshing experience to realize that much of what we use today by default was discovered only within the last few years: things like Adam, Batch Norm, GCNs, etc.