Recent applications of expanders to graph algorithms
Informally, a graph is an expander if its nodes are robustly connected, i.e. removing some edges would not break connectivity. Expanders have been used extensively to improve the running time of many graph algorithms. This talk gives a gentle introduction to expanders and their applications to static, dynamic, iterative, and distributed algorithms on graphs.
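The robustness in this definition can be made quantitative via the edge expansion h(G): the minimum, over nonempty vertex sets S with |S| ≤ n/2, of the number of edges leaving S divided by |S|. A brute-force sketch for tiny graphs (the function name and toy graphs are illustrative, not from the talk):

```python
from itertools import combinations

def edge_expansion(n, edges):
    # h(G) = min over nonempty S with |S| <= n/2 of |edges leaving S| / |S|
    best = float('inf')
    for k in range(1, n // 2 + 1):
        for subset in combinations(range(n), k):
            S = set(subset)
            boundary = sum((u in S) != (v in S) for u, v in edges)
            best = min(best, boundary / len(S))
    return best

# the complete graph K4 expands well; a path on 4 vertices does not
K4 = [(u, v) for u in range(4) for v in range(u + 1, 4)]
path = [(0, 1), (1, 2), (2, 3)]
print(edge_expansion(4, K4))   # 2.0
print(edge_expansion(4, path)) # 0.5
```

A large h(G) is exactly the "removing some edges would not break connectivity" property: every cut has many crossing edges relative to its smaller side.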
YouTube
Recent Applications of Expanders to Graph Algorithms - Thatchaphol Saranurak (Uni. of Michigan)
Abstract: Expanders have enabled exciting progress in several areas of graph algorithms over the last few years. As examples, we show
(1) the first deterministic almost-linear time algorithms for solving Laplacian systems and computing approximate max…
Graph Neural Networks for Binding Affinity Prediction
In-depth blog post about applications of GNNs to drug discovery and, in particular, to virtual screening for candidate molecules.
Medium
Graph Neural Networks for Binding Affinity Prediction
AI in Drug Discovery
Graph workshop at AAAI 2021: videos
Videos for a recent graph workshop at AAAI 2021 are available online. There are several keynotes including William Hamilton and Stephen Bach, as well as 2-min flash paper presentations.
Google
Home
The study of complex graphs is a highly interdisciplinary field that aims to study complex systems by using mathematical models, physical laws, inference and learning algorithms, etc. Complex systems are often characterized by several components that interact…
Postdoc position at EPFL
A very interesting postdoc position at EPFL to work on molecule design. The following text is by Andreas Loukas.
We are hiring a postdoc to work on the interface between AI and computational protein design. The project will be carried out at EPFL in collaboration with Bruno Correia, Michael Bronstein, Pierre Vandergheynst, and the Swiss Data Science Center.
We offer a 2-year position at EPFL, a vibrant university (well… post-COVID) located in one of the most beautiful countries. The salary is very competitive.
The researcher will partake in an interdisciplinary effort to design novel proteins using tools from deep learning. The ideal candidate combines (i) practical deep learning/GNN know-how and (ii) experience with generative models and/or reinforcement learning. Knowledge of biology is not required, but a willingness to learn is.
Relevant work: https://tinyurl.com/1stzxmkj
If you are interested, send me by email: a motivation letter explaining how your expertise fits the current position, a CV, the names/addresses of three references, and three selected publications. We will start reviewing applications on the 15th of March.
Andreas Loukas (find email at andreasloukas.blog)
actu.epfl.ch
Predicting a protein's behavior from its appearance
Researchers at EPFL have developed a new way to predict a protein's interactions with other proteins and biomolecules, and its biochemical activity, merely by observing its surface. The method, published in open-source format, opens up new possibilities for…
GNN User Group: meeting 2
The second meeting of the GNN user group organized by AWS and NVIDIA. There are 3 presentations: GNNs on GPUs, cuGraph, and learning mechanisms of GNNs. The event is free.
Eventbrite
Graph Neural Networks User Group
Fresh picks from ArXiv
This week on ArXiv: an improved generalization bound, deblurring images with Laplacians, and new datasets for drug discovery
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Generalization bounds for graph convolutional neural networks via Rademacher complexity
* SSFG: Stochastically Scaling Features and Gradients for Regularizing Graph Convolution Networks
* Dissecting the Diffusion Process in Linear Graph Convolutional Networks
* E(n) Equivariant Graph Neural Networks with Max Welling
* Combinatorial optimization and reasoning with graph neural networks with Petar Veličković
* Topological Graph Neural Networks with Karsten Borgwardt
Applications
* Graph Laplacian for image deblurring
* Interpretable Stability Bounds for Spectral Graph Filters
* On the Similarity between von Neumann Graph Entropy and Structural Information: Interpretation, Computation, and Applications
* A Review of Biomedical Datasets Relating to Drug Discovery: A Knowledge Graph Perspective with William Hamilton
arXiv.org
Combinatorial optimization and reasoning with graph neural networks
Combinatorial optimization is a well-established area in operations research and computer science. Until recently, its methods have focused on solving problem instances in isolation, ignoring that...
GNNSys'21 -- Workshop on Graph Neural Networks and Systems
A graph-related workshop organized at MLSys 2021, with a submission deadline of March 7. It could be particularly interesting as it will highlight applications of GNNs in real production systems.
The Easiest Unsolved Problem in Graph Theory
Our new blog post about the reconstruction conjecture, a well-known graph theory problem with 80 years of results but no final proof yet. I have already written several posts in this channel about it; to me, it's one of the grand challenges in graph theory (along with the graph isomorphism problem). There seems to be quite some progress, so I hope to see it resolved during my lifetime. In the meantime, we considered graph families for which the reconstruction conjecture is known to be true and tried to come up with the easiest family of graphs that is still unresolved and has very few vertices. The resulting family is a type of bidegreed graphs (close to regular) on 20 vertices, which could probably be verified by computer (though it would take a year or so).
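For readers new to the problem: the conjecture states that every graph on at least 3 vertices is determined, up to isomorphism, by its deck, the multiset of its vertex-deleted subgraphs. A brute-force sketch of deck computation for tiny graphs (function names and examples are illustrative, not from the post):

```python
from itertools import permutations

def canonical(n, edges):
    # smallest relabeled edge set over all vertex permutations
    # (exponential, but fine for the tiny "cards" used here)
    es = [tuple(sorted(e)) for e in edges]
    best = None
    for p in permutations(range(n)):
        relab = tuple(sorted(tuple(sorted((p[u], p[v]))) for u, v in es))
        if best is None or relab < best:
            best = relab
    return best

def deck(n, edges):
    # one "card" per vertex: the subgraph left after deleting that vertex,
    # canonized so that isomorphic cards compare equal
    cards = []
    for v in range(n):
        keep = [u for u in range(n) if u != v]
        idx = {u: i for i, u in enumerate(keep)}
        sub = [(idx[a], idx[b]) for a, b in edges if a != v and b != v]
        cards.append(canonical(n - 1, sub))
    return sorted(cards)

# the path P4 and the star K1,3 both have 4 vertices and 3 edges,
# yet their decks already differ
P4 = [(0, 1), (1, 2), (2, 3)]
star = [(0, 1), (0, 2), (0, 3)]
print(deck(4, P4) == deck(4, star))  # False
```

The hard direction, of course, is the converse: showing that no two non-isomorphic graphs share a deck.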
Medium
The Easiest Unsolved Problem in Graph Theory
Graph theory has a long history of problems being solved by amateur mathematicians. Do you want to try yourself to become one of them?
The Transformer Network for the Traveling Salesman Problem
(video and slides) Another great tutorial from Xavier Bresson on the traveling salesman problem (TSP) and recent ML approaches to solving it. It gives a nice overview of current solvers such as Concorde or Gurobi and their computational complexity.
GML Newsletter: Homophily, Heterophily, and Oversmoothing for GNNs
Apparently, Cora and OGB datasets are mostly assortative, i.e. nodes with the same labels tend to be connected. In many real-world applications this is not the case: nodes of different groups are connected, while connections within the groups are sparse. Such datasets are called disassortative graphs.
What has been realized in 2020 and now in 2021 is that typical GNNs like GCN do not work well on disassortative graphs. So several GNN architectures were proposed to get good performance on these datasets. Not only do these new GNNs work well on both assortative and disassortative graphs, but they also alleviate the problem of oversmoothing, i.e. they make it possible to effectively stack many GNN layers.
In my new email newsletter I discuss this shift from assortative to disassortative GNNs and its relation to oversmoothing. What's interesting is that existing approaches still do not rely explicitly on the labels, but rather learn parameters to account for heterophily. In the future, I expect more techniques for integrating target labels directly into GNN algorithms.
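Assortativity of a labeled graph is commonly quantified by the edge homophily ratio: the fraction of edges whose endpoints share a label (high for homophilous datasets like Cora, low for disassortative ones). A minimal sketch, with a toy graph of my own:

```python
def edge_homophily(edges, labels):
    # fraction of edges connecting nodes with the same label
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)

# two same-label triangles joined by a single cross-label edge:
# 6 of 7 edges are intra-class, so the graph is strongly assortative
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
labels = {0: 'a', 1: 'a', 2: 'a', 3: 'b', 4: 'b', 5: 'b'}
print(edge_homophily(edges, labels))  # 6/7 ~ 0.857
```

A ratio near 1 means a smoothing GNN like GCN is well matched to the data; a ratio near 0 signals heterophily, where the architectures discussed above are needed.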
Graph Machine Learning
GML Newsletter: Homophily, Heterophily, and Oversmoothing for GNNs
"Birds of a feather flock together"
Fresh picks from ArXiv
This week on ArXiv: 2 surveys on self-supervised graph learning, fair embeddings, and combined structural and positional node embeddings
If I forgot to mention your paper, please shoot me a message and I will update the post.
Survey
* Graph Self-Supervised Learning: A Survey with Philip S. Yu
* Graph-based Semi-supervised Learning: A Comprehensive Review
* Meta-Learning with Graph Neural Networks: Methods and Applications
* Benchmarking Graph Neural Networks on Link Prediction
* A Survey of RDF Stores & SPARQL Engines for Querying Knowledge Graphs
Embeddings
* Towards a Unified Framework for Fair and Stable Graph Representation Learning with Marinka Zitnik
* Node Proximity Is All You Need: Unified Structural and Positional Node and Graph Embedding with Danai Koutra
Video and slides: GNN User Group meeting 2
In the second meeting of the GNN user group, there is a discussion of the new release of DGL, graph analytics on GPUs, as well as new approaches for training GNNs, including those on disassortative graphs. Slides can be found in the Slack channel.
YouTube
Graph Neural Networks User Group Meeting on Feb 25, 2021
Logistics and announcement on the newest release of DGL 0.6.
Our featured speakers are:
4:00 - 4:25 (PST): Gunrock: Graph Analytics on GPU (Dr. John Owens, University of California, Davis)
4:25 - 4:50 (PST): NVIDIA CuGraph - An Open-Source Package for…
Theoretical Foundations of Graph Neural Networks
Video presentation by Petar Veličković, who covers the design, history, and applications of GNNs. A lot of interesting concepts, such as permutation invariance and equivariance, are discussed. Slides can be found here.
YouTube
Theoretical Foundations of Graph Neural Networks
Deriving graph neural networks (GNNs) from first principles, motivating their use, and explaining how they have emerged along several related research lines.
Computer Laboratory Wednesday Seminar, 17 February 2021
Slide deck: https://petar-v.com/talks/GNN…
Trapped in a Desert Maze: AlphaZero learns to deal with graphs
A fun video and blog post about AlphaZero playing a simple game of node covering on graphs. I wish there were a human interface to play against these creatures.
YouTube
Battle of A.I. wits: 2 bots trapped in a 3D desert maze
This channel documents our research into the behavior of AI agents, or creatures, living on graphs. In this episode, we describe the development of the "Ringo" test scenario. Then we show two sample "games", set in a 3D environment. The environment is a 3D…
Graph Machine Learning research groups: Tyler Derr
I do a series of posts on groups in graph research; the previous post is here. The 24th is Tyler Derr, a young professor in graph ML who proposed signed GNNs for graphs with negative links.
Tyler Derr (~1992)
- Affiliation: Vanderbilt University
- Education: Ph.D. at Michigan State University in 2020 (advisor: Jiliang Tang)
- h-index 10
- Awards: best papers at SDM
- Interests: adversarial attacks, graph neural networks
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Austin R. Benson
I do a series of posts on the groups in graph research, previous post is here. The 23rd is Austin R. Benson, a professor at Cornell, who together with his students recently shook the graph community…
Paper Notes in Notion
Vitaly Kurin discovered a great format to track notes for the papers he reads. These are short and clean digests of papers at the intersection of GNNs and RL, and I would definitely recommend looking them up if you are studying the same papers. You can create the same format in Notion by adding a new page (database -> list) and then clicking the New button and selecting the properties that are necessary.
yobispace on Notion
Paper Notes by Vitaly Kurin | Notion
Meta-World: A Benchmark and Evaluation for Multi-Task and Meta Reinforcement Learning
Fresh picks from ArXiv
This week on ArXiv: unsupervised community detection, reducing variance with sampling, and GNNs for quantum calculations
If I forgot to mention your paper, please shoot me a message and I will update the post.
Algorithms
* Graph Force Learning
* Recurrent Graph Neural Network Algorithm for Unsupervised Network Community Detection
* ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations
* Autobahn: Automorphism-based Graph Neural Nets
* Graph Convolutional Embeddings for Recommender Systems SIGIR 2021
* On the Importance of Sampling in Learning Graph Convolutional Networks
* Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework
Software
* Implementing graph neural networks with TensorFlow-Keras
* Large Graph Convolutional Network Training with GPU-Oriented Data Communication Architecture
Survey
* Deep Graph Structure Learning for Robust Representations: A Survey
MLSys 2021 Conference
MLSys is one of the main conferences on applications of ML in the real world. Accepted papers for MLSys 2021 are available here. It will also feature a GNN workshop and keynote speakers from NVIDIA, PyTorch, and others. Dates are April 5-9, 2021. Registration is $25 for students.
mlsys.org
MLSys 2021 Schedule
MLSys Website
Top-10 Research Papers in AI
A new blog post about the top-10 most cited papers in AI during the last 5 years. I looked at major AI conferences and journals (excluding CV and NLP conferences).
It was quite a refreshing experience to realize that much of what we use today by default was discovered only within the last few years: things like Adam, Batch Norm, GCNs, etc.
Medium
Top-10 Research Papers in AI
Each year scientists from around the world publish thousands of research papers in AI but only a few of them reach wide audiences and make a global impact in the world. Below are the top-10 most…
Deep Learning and Combinatorial Optimization IPAM Workshop
A great workshop on the intersection of ML, RL, GNNs, and combinatorial optimization. Videos are available. Topics include applications of ML to chip design, TSP, physics, integer programming and more.
IPAM
Deep Learning and Combinatorial Optimization - IPAM