Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out @gimmeblues, @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Cleora Paper

I already wrote about Cleora, an unsupervised embedding library; now there is a paper explaining its details. The algorithm is essentially a form of iterated matrix multiplication, yet it beats PyTorch-BigGraph, DeepWalk, and others on both link prediction metrics and running time.
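As a rough illustration of the "just matrix multiplication" idea, here is my own sketch of a Cleora-style embedding (not the library's actual code): multiply a row-normalized adjacency matrix with randomly initialized embeddings, L2-normalize the rows, and repeat.

```python
import numpy as np

def cleora_like_embed(adj, dim=8, n_iter=3, seed=0):
    """Sketch of a Cleora-style embedding: repeatedly average neighbor
    embeddings via a row-normalized adjacency, renormalizing each row."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0          # avoid division by zero for isolated nodes
    t = adj / deg                # row-normalized transition matrix
    emb = rng.uniform(-1, 1, size=(n, dim))
    for _ in range(n_iter):
        emb = t @ emb            # average neighbor embeddings
        norms = np.linalg.norm(emb, axis=1, keepdims=True)
        norms[norms == 0] = 1.0
        emb = emb / norms        # project each row back to the unit sphere
    return emb

# Toy example: a 4-node path graph 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
E = cleora_like_embed(A)
```

The absence of any learned parameters is what makes the running time so competitive.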
How to get started with Graph Machine Learning

In a new post, Aleksa Gordić talks in depth about graph ML and its applications, and shares useful resources to get you started in this field.
Graphs and More Complex Structures for Learning and Reasoning Workshop

A workshop at AAAI 2021 featuring a talk on learning knowledge graph representations for zero-shot learning in NLP and vision.
Graph Neural Networks from the First Principles

Petar Veličković will give a talk on 17 Feb about how GNNs appeared in different disciplines and how you can derive GNNs from permutation invariance. Petar has long worked in this field and knows graph nets inside and out, so I strongly recommend attending his talk. The link is here.
Job Posting for Research Scientist at NEC Labs Europe

Several researcher positions are available at NEC Labs Europe, a research institute with a focus on CS/ML applications in the life sciences. One involves working with Dr. Mathias Niepert, who has published many works in the graph ML field. The deadline is 31 March.
Graph Machine Learning research groups: Austin R. Benson

I'm doing a series of posts on groups in graph research; the previous post is here. The 23rd is Austin R. Benson, a professor at Cornell, who together with his students recently shook the graph community by showing that label propagation works really well compared to GNNs.

Austin R. Benson (~1990)
- Affiliation: Cornell
- Education: Ph.D. at Stanford in 2017 (advisor: Jure Leskovec)
- h-index 21
- Awards: best research paper awards at KDD and ASONAM; Kavli Fellow
- Interests: label propagation, clustering, network algorithms
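For context on why the label propagation result is so striking: the method itself fits in a dozen lines. Here is my own toy sketch (not Benson's actual code), diffusing one-hot labels of known nodes over a row-normalized adjacency while re-clamping the known labels.

```python
import numpy as np

def label_propagation(adj, labels, mask, alpha=0.9, n_iter=50):
    """Minimal label propagation: spread seed labels through the graph,
    keeping the known labels fixed at every iteration."""
    n_classes = labels.max() + 1
    y0 = np.zeros((adj.shape[0], n_classes))
    y0[mask, labels[mask]] = 1.0          # one-hot seed labels
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    t = adj / deg                          # row-normalized transition matrix
    y = y0.copy()
    for _ in range(n_iter):
        y = alpha * (t @ y) + (1 - alpha) * y0
        y[mask] = y0[mask]                 # clamp known labels
    return y.argmax(axis=1)

# Two triangles joined by one edge, with one labeled node per triangle.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
labels = np.array([0, 0, 0, 1, 1, 1])
mask = np.array([True, False, False, False, False, True])
pred = label_propagation(A, labels, mask)
```

No training, no parameters, yet on homophilous graphs this baseline is surprisingly hard to beat.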
Learning mesh-based simulation with Graph Networks

Another work by DeepMind (ICLR '21) on how to simulate physical systems with GNNs. The principle is the same as in their previous works: build a graph for the system, process it with a GNN, obtain an acceleration for each node, and feed it to an Euler integrator to obtain each node's position at the next step. Again, very cool visualizations.
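The integration step is easy to sketch. In this toy of mine (not DeepMind's code), `acc` is a stand-in for what the GNN would predict per node; here it is just constant gravity.

```python
import numpy as np

def euler_step(pos, vel, acc, dt=0.01):
    """One semi-implicit Euler step: update velocity from acceleration,
    then position from the new velocity."""
    vel_next = vel + dt * acc
    pos_next = pos + dt * vel_next
    return pos_next, vel_next

# Stand-in for GNN output: constant gravity acting on two particles.
pos = np.zeros((2, 3))
vel = np.zeros((2, 3))
acc = np.array([[0.0, 0.0, -9.81]] * 2)
for _ in range(100):          # simulate 1 second at dt = 0.01
    pos, vel = euler_step(pos, vel, acc)
```

Rolling this step out over hundreds of frames is exactly where learned simulators shine or fail, since prediction errors compound.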
Recent applications of expanders to graph algorithms

Informally, a graph is an expander if its nodes are robustly connected, i.e. removing some edges would not break connectivity. Expanders have been used extensively to improve the running time of many graph algorithms. This talk gives a gentle introduction to expanders and their applications to static, dynamic, iterative, and distributed algorithms on graphs.
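For intuition, expansion can be quantified spectrally: for a d-regular graph, the gap between the largest adjacency eigenvalue (which equals d) and the second largest in absolute value controls how well the graph mixes. A small sketch of mine, comparing a great expander with a poor one:

```python
import numpy as np

def spectral_gap(adj):
    """Gap between the largest and second-largest (in absolute value)
    adjacency eigenvalues; for regular graphs, a large gap means
    good expansion (cf. the expander mixing lemma)."""
    eig = np.sort(np.abs(np.linalg.eigvalsh(adj)))[::-1]
    return eig[0] - eig[1]

# Complete graph K5 (an excellent expander) vs the 5-cycle C5 (a poor one).
K5 = np.ones((5, 5)) - np.eye(5)
C5 = np.zeros((5, 5))
for i in range(5):
    C5[i, (i + 1) % 5] = C5[i, (i - 1) % 5] = 1
```

K5's gap is 3 (eigenvalues 4 and -1), while C5's is only about 0.38, reflecting how slowly a cycle mixes.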
Graph Neural Networks for Binding Affinity Prediction

In-depth blog post about applications of GNN to drug discovery, and, in particular, to virtual screening for candidate molecules.
Postdoc position at EPFL

A very interesting postdoc position at EPFL to work on molecule design. The following text is by Andreas Loukas.

We are hiring a postdoc to work on the interface between AI and computational protein design. The project will be carried out at EPFL in collaboration with Bruno Correia, Michael Bronstein, Pierre Vandergheynst, and the Swiss Data Science Center.

We offer a 2-year position at EPFL, a vibrant university (well.. post covid) located in one of the most beautiful countries. The salary is very competitive.

The researcher will partake in an interdisciplinary effort to design novel proteins using tools from deep learning. The ideal candidate combines i) practical deep learning/GNN know-how and ii) experience with generative models and/or reinforcement learning. Knowledge of biology is not required, but a willingness to learn is.

Relevant work: https://tinyurl.com/1stzxmkj

If you are interested, send me by email: a motivation letter explaining how your expertise fits the position, a CV, the names and addresses of three references, and three selected publications. We will start reviewing applications on the 15th of March.

Andreas Loukas (find email at andreasloukas.blog)
GNN User Group: meeting 2

The second meeting of the GNN user group organized by AWS and NVIDIA. There are 3 presentations: GNNs on GPUs, cuGraph, and the learning mechanisms of GNNs. The event is free.
GNNSys'21 -- Workshop on Graph Neural Networks and Systems

A graph-related workshop organized at MLSys 2021, with a submission deadline of 7 March. It could be particularly interesting, as it will highlight applications of GNNs in real production systems.
The Easiest Unsolved Problem in Graph Theory

Our new blog post about the reconstruction conjecture, a well-known graph theory problem with 80 years of results but no final proof yet. I have already written several posts in this channel about it; to me it's one of the grand challenges in graph theory (along with the graph isomorphism problem). There seems to be quite some progress, so I hope to see it resolved during my lifetime. In the meantime, we surveyed graph families for which the reconstruction conjecture is known to be true and tried to come up with the easiest family of graphs that is still unresolved and has very few vertices. The resulting family is a type of bidegreed graphs (close to regular) on 20 vertices, which can probably be verified by computer (though it would take a year or so).
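For readers new to the problem: the conjecture says that for n >= 3, a graph is determined up to isomorphism by its "deck", the multiset of its vertex-deleted subgraphs. Computing a deck is simple; here is a small helper of my own (not from the post):

```python
from itertools import combinations  # not strictly needed; kept for experiments

def deck(edges, n):
    """Return the deck of a graph on vertices 0..n-1: for each vertex v,
    the edge set of the subgraph with v deleted, relabeled to 0..n-2."""
    cards = []
    for v in range(n):
        relabel = {u: i for i, u in enumerate(u for u in range(n) if u != v)}
        card = frozenset(frozenset((relabel[a], relabel[b]))
                         for a, b in edges if v not in (a, b))
        cards.append(card)
    return cards

# Path graph P4: 0-1-2-3.
p4 = [(0, 1), (1, 2), (2, 3)]
cards = deck(p4, 4)
```

Verifying the conjecture for a family means checking that no two non-isomorphic graphs in it share a deck up to isomorphism, which is why even 20-vertex families take serious compute.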
The Transformer Network for the Traveling Salesman Problem

(video and slides) Another great tutorial from Xavier Bresson on the traveling salesman problem (TSP) and recent ML approaches to solving it. It gives a nice overview of current solvers such as Concorde and Gurobi and their computational complexity.
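As a baseline for what the tutorial's learned and exact solvers compete against, the classic nearest-neighbor heuristic is a few lines. My own sketch (not from the tutorial):

```python
import math

def nearest_neighbor_tour(points):
    """Greedy TSP heuristic: start at city 0 and repeatedly visit the
    nearest unvisited city. O(n^2) and fast, but can be far from
    optimal, hence the interest in Concorde and learned solvers."""
    n = len(points)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    """Total length of the closed tour (returning to the start)."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Four corners of a unit square: the optimal tour has length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
t = nearest_neighbor_tour(pts)
```

On this tiny instance the greedy tour is optimal; on larger random instances it typically overshoots the optimum by a noticeable margin.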
GML Newsletter: Homophily, Heterophily, and Oversmoothing for GNNs

Apparently, Cora and OGB are mostly assortative datasets, i.e. nodes with the same label tend to be connected. In many real-world applications this is not the case: nodes from different groups are connected, while connections within groups are sparse. Such datasets are called disassortative graphs.
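Assortativity in this sense is often measured with the edge homophily ratio: the fraction of edges whose endpoints share a label. A minimal sketch of my own:

```python
def edge_homophily(edges, labels):
    """Edge homophily ratio: fraction of edges connecting same-label
    nodes. Near 1 = assortative (homophilous); near 0 = disassortative."""
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)

# Two same-label triangles joined by a single cross-label edge.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
labels = [0, 0, 0, 1, 1, 1]
ratio = edge_homophily(edges, labels)   # 6 of 7 edges are same-label
```

Cora scores high on this metric, while the disassortative benchmarks used in the heterophily papers score well below 0.5.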

What was realized in 2020, and now in 2021, is that typical GNNs like GCN do not work well on disassortative graphs, so several GNN architectures have been proposed to achieve good performance on these datasets. Not only do these new GNNs work well on both assortative and disassortative graphs, they also alleviate the problem of oversmoothing, effectively allowing many GNN layers to be stacked.
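Oversmoothing itself is easy to demonstrate numerically. In this toy of mine (no learned weights, no nonlinearity), stacking plain mean-aggregation layers drives all node features toward the same vector on a connected graph:

```python
import numpy as np

def smooth(features, adj, n_layers):
    """Apply n_layers of pure mean aggregation (row-normalized adjacency).
    On a connected, aperiodic graph this converges to a degree-weighted
    average, so all node features become indistinguishable."""
    deg = adj.sum(axis=1, keepdims=True)
    t = adj / deg
    x = features
    for _ in range(n_layers):
        x = t @ x
    return x

# Connected 3-node graph with self-loops (self-loops make the walk aperiodic).
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
X = np.array([[1.0], [0.0], [-1.0]])
shallow = smooth(X, A, 1)    # features still distinct
deep = smooth(X, A, 50)      # features nearly identical
```

The heterophily-aware architectures counteract exactly this collapse, which is why they help with depth too.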

In my new email newsletter I discuss this shift from assortative to disassortative GNNs and its relation to oversmoothing. What's interesting is that existing approaches still do not rely explicitly on the labels, but rather learn parameters to account for heterophily. In the future, I think there will be more techniques for integrating target labels directly into the GNN algorithm.