Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks
A video presentation (and slides) by Cristian Bodnar & Fabrizio Frasca on a new type of GNN that defines neighborhoods based on the simplicial complexes of a graph. It goes quite deep into the theory, with supporting experiments on graph isomorphism, graph classification, and trajectory disambiguation.
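To make the neighborhoods concrete, here is a minimal sketch (not the paper's implementation) of one common lift from a graph to a simplicial complex, the clique complex, where every (k+1)-clique becomes a k-simplex; message passing then runs over these higher-order cells instead of just nodes and edges.

```python
from itertools import combinations

def clique_complex(edges, max_dim=2):
    """Lift a graph to its clique complex up to dimension 2:
    nodes are 0-simplices, edges 1-simplices, triangles 2-simplices."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    nodes = sorted(adj)
    simplices = {
        0: [(v,) for v in nodes],
        1: [tuple(sorted(e)) for e in edges],
    }
    # A 2-simplex (triangle) is three mutually adjacent nodes.
    simplices[2] = [
        tri for tri in combinations(nodes, 3)
        if all(b in adj[a] for a, b in combinations(tri, 2))
    ]
    return simplices

# Square with one diagonal: two triangles sharing the edge (0, 2).
sx = clique_complex([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
print(sx[2])  # [(0, 1, 2), (0, 2, 3)]
```

In the full method, each simplex also exchanges messages with its boundary and co-boundary simplices; the lift above only shows how the cells themselves are obtained.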
Videos from CS224W
The legendary Stanford CS224W course on graph ML is now releasing its 2021 lectures on YouTube, with two lectures promised each week. Slides are available on the site too (homeworks are still missing).
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 1.1 - Why Graphs
For more information about Stanford's Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3Bu1w3n
Jure Leskovec
Computer Science, PhD
Graphs are a general language for describing and analyzing entities with relations/interactions…
Fresh picks from ArXiv
This week on ArXiv: the equivalence of graph matching and GED, hyperbolic GNNs, and de-anonymization of blockchains
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks with Philip S. Yu
* Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks with Philip S. Yu
* Generative Causal Explanations for Graph Neural Networks
* DistGNN: Scalable Distributed Training for Large-Scale Graph Neural Networks
* Identity Inference on Blockchain using Graph Neural Network
Conferences
* MEG: Generating Molecular Counterfactual Explanations for Deep Graph Networks IJCNN 2021
* FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks MLSys Workshop 2021
* Search to aggregate neighborhood for graph neural network ICDE 2021
Hyperbolic
* A Hyperbolic-to-Hyperbolic Graph Convolutional Network
* Lorentzian Graph Convolutional Networks
Math
* On the unification of the graph edit distance and graph matching problems
Self-supervised learning of GNNs
Self-supervised learning (SSL) is a learning paradigm for settings where we have large amounts of unlabeled data and want to obtain representations of the input that we can later use for downstream tasks. The difference between unsupervised and self-supervised learning is that unsupervised learning attempts to learn a representation on a single input, while SSL assumes there is a model trained across several inputs.
An example of unsupervised learning on graphs is graph kernels, which boil down to counting some statistics on graphs (e.g. motifs) to represent a graph. An example of SSL is when you first create multiple views of the same graph (e.g. by permuting the edges) and then train a model to distinguish views of different graphs. DeepWalk, node2vec, and other pre-GNN node embeddings are somewhere in between: they are usually applied to a single graph, but the concept could well be applied to learning representations on many graphs too.
There is a recent boom in this area for graphs, so there are some fresh surveys available (here and here) as well as the awesome list of SSL-GNNs.
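The create-views-and-compare idea above can be sketched in a few lines. Everything here is a toy stand-in: the augmentation is random edge dropping, and the "encoder" is just a degree histogram (real methods use a GNN encoder and a contrastive loss such as InfoNCE):

```python
import math
import random

def edge_drop(edges, p=0.2, seed=0):
    """One simple augmentation: randomly drop a fraction p of the edges."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() > p]

def degree_histogram(edges, n_nodes, max_deg=5):
    """Toy 'encoder': represent a graph by its node-degree histogram."""
    deg = [0] * n_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist = [0] * (max_deg + 1)
    for d in deg:
        hist[min(d, max_deg)] += 1
    return hist

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Two graphs on 6 nodes; two augmented views of the first.
cycle = [(i, (i + 1) % 6) for i in range(6)]  # 6-cycle
star = [(0, i) for i in range(1, 6)]          # star
v1, v2 = edge_drop(cycle, seed=1), edge_drop(cycle, seed=2)
w1 = edge_drop(star, seed=1)
z = lambda e: degree_histogram(e, 6)

# A contrastive objective pulls views of the same graph together
# and pushes views of different graphs apart:
print(cosine(z(v1), z(v2)) > cosine(z(v1), z(w1)))
```

The training signal in real SSL pipelines comes from maximizing exactly this kind of view agreement, with the encoder's parameters learned end to end.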
Awesome graph repos
Collections of methods and papers for specific graph topics.
Graph-based Deep Learning Literature – Links to conference publications and the top 10 most-cited publications, related workshops, and surveys / literature reviews / books in graph-based deep learning.
awesome-graph-classification – A collection of graph classification methods, covering embedding, deep learning, graph kernel, and factorization papers with reference implementations.
Awesome-Graph-Neural-Networks – A collection of resources related to graph neural networks.
awesome-graph – A curated list of resources for graph databases and graph computing tools.
awesome-knowledge-graph – A curated list of Knowledge Graph related learning materials, databases, tools, and other resources.
awesome-knowledge-graph – A curated list of awesome knowledge graph tutorials, projects, and communities.
Awesome-GNN-Recommendation – Graph mining for recommender systems.
awesome-graph-attack-papers – Links to works about adversarial attacks and defenses on graph data or GNNs.
Graph-Adversarial-Learning – Attack-related papers, defense-related papers, robustness certification papers, etc., ranging from 2017 to 2021.
awesome-self-supervised-gnn – Papers about self-supervised learning on GNNs.
awesome-self-supervised-learning-for-graphs – A curated list of awesome self-supervised graph representation learning resources.
Awesome-Graph-Contrastive-Learning – A collection of resources related to graph contrastive learning.
GitHub
GitHub - naganandy/graph-based-deep-learning-literature: links to conference publications in graph-based deep learning
links to conference publications in graph-based deep learning - naganandy/graph-based-deep-learning-literature
Graph Machine Learning research groups: Leman Akoglu
I do a series of posts on the groups in graph research; the previous post is here. The 27th is Leman Akoglu, a professor at Carnegie Mellon University, with interests in detecting anomalies in graphs.
Leman Akoglu (~1983)
- Affiliation: Carnegie Mellon University
- Education: Ph.D. at Duke University in 2012 (advisor: Christos Faloutsos)
- h-index 40
- Interests: anomaly detection, graph neural networks
- Awards: best research papers at PAKDD, SIAM SDM, ECML PKDD
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Mingyuan Zhou
I do a series of posts on the groups in graph research; the previous post is here. The 26th is Mingyuan Zhou, a professor at the University of Texas, who has been working on statistical aspects of GNNs. …
Graph Neural Networks in Computational Biology
Slides from Petar Veličković about his journey using machine learning algorithms on biological data.
Fresh picks from ArXiv
This week on ArXiv: accelerated inference for GNNs, graph MLPs, and graph-augmented sponsored search
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Learnable Online Graph Representations for 3D Multi-Object Tracking
* GMLP: Building Scalable and Flexible Graph Neural Networks with Feature-Message Passing
* Accelerating SpMM Kernel with Cache-First Edge Sampling for Graph Neural Networks
Conferences
* Efficient Relation-aware Scoring Function Search for Knowledge Graph Embedding ICDE 2021
* Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning SIGIR 2021
* AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search SIGIR 2021
GNN User Group: meeting 4
The fourth meeting of the GNN user group will include talks from me (Sergey Ivanov), where I will talk about the combination of GBDTs and GNNs, and from Professor Pan Li of Purdue University, who will speak about constructing structural features to improve representations in temporal networks. Please join us on Thursday!
Eventbrite
Graph Neural Networks User Group
Videos from WebConf 2021
Videos from WebConf 2021 are available here. There are many graph talks on topics such as GNNs, graph models, knowledge graphs, graph embeddings, link prediction, and more.
videolectures.net
30. The Web Conference VIRTUAL EDITION, Ljubljana 2021
30. The Web Conference was to be held in Ljubljana, the capital of Slovenia, in the heart of Europe. Due to the current health problems worldwide, we should offer the best possible user experience as a completely virtual conference. We invite you to join…
Geometric Deep Learning Book
A new book on geometric deep learning by graph ML experts Michael M. Bronstein, Joan Bruna, Taco Cohen, and Petar Veličković has been released: 156 pages exploring the symmetries that unify different neural network architectures. An accompanying post nicely introduces the history of geometry and its impact on physics. It's exciting to see a categorization of many ML approaches from the perspective of group theory.
Medium
Geometric foundations of Deep Learning
Geometric Deep Learning is an attempt to unify a broad class of ML problems from the perspectives of symmetry and invariance.
Graph Representation Learning for Drug Discovery Slides
Slides from Jian Tang of the talk on de novo drug discovery and drug repurposing.
GNN User Group Meeting 4 video
Video from the 4th meeting of the GNN user group, which includes a talk from me (on the GBDT+GNN model) and one from Professor Pan Li on causal anonymous walks for temporal graphs. Slides can be found on the DGL Slack channel.
YouTube
Graph Neural Networks User Group Meeting on April 29, 2021
Agenda 4/29/2021:
4:00 - 4:30 (PST): Boost then Convolve: Gradient Boosting Meets Graph Neural Networks (Dr. Sergey Ivanov, Criteo, Russia).
4:30 - 5:00 (PST): Inductive Representation Learning of Temporal Networks via Causal Anonymous Walks (Prof. Pan Li…
Invariant and equivariant layers with applications to GNN, PointNet and Transformers
A blog post by Marc Lelarge about invariant and equivariant functions and their relation to the universality and expressivity of GNNs. As the main result, they show that any invariant/equivariant function on n points can be represented as a sum of functions applied to each point independently.
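A minimal sketch of that invariant form, f(x1, …, xn) = rho(sum_i phi(x_i)), known as the Deep Sets decomposition. Here phi and rho are arbitrary toy functions chosen for illustration, not learned networks:

```python
import math

def phi(x):
    # Per-point embedding; a tiny fixed feature map for illustration.
    return (x, x * x)

def rho(s):
    # Readout applied to the summed embedding.
    return s[0] + math.sqrt(s[1])

def f(points):
    """Permutation-invariant function: sum per-point embeddings, then read out."""
    s = [0.0, 0.0]
    for x in points:
        e = phi(x)
        s[0] += e[0]
        s[1] += e[1]
    return rho(s)

# Permuting the input leaves the output unchanged:
print(f([1.0, 2.0, 3.0]) == f([3.0, 1.0, 2.0]))  # True
```

Invariance holds by construction because summation ignores the order of its arguments; the universality result says this simple form is in fact fully general for invariant functions.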
Fresh picks from ArXiv
This week on ArXiv: knowledge graphs in production at Alibaba, links between WL and GNNs, and a more robust classifier with an energy-based view
If I forgot to mention your paper, please shoot me a message and I will update the post.
Applications
* Billion-scale Pre-trained E-commerce Product Knowledge Graph Model ICDE 2021
* Improving Conversational Recommendation System by Pretraining on Billions Scale of Knowledge Graph ICDE 2021
* Fast Multiscale Diffusion on Graphs
* An Energy-Based View of Graph Neural Networks
* Structure-Aware Hierarchical Graph Pooling using Information Bottleneck
* Node Embedding using Mutual Information and Self-Supervision based Bi-level Aggregation
* Unified Spatio-Temporal Modeling for Traffic Forecasting using Graph Neural Network
Survey
* The Logic of Graph Neural Networks with Martin Grohe
* Graph Vulnerability and Robustness: A Survey
* Graph Learning: A Survey
* Graph Neural Networks for Traffic Forecasting
Video: Workshop of Graph Neural Networks and Systems (GNNSys'21)
Very interesting videos from the workshop at MLSys 2021 on GNNs in industry. The talks cover topics such as GNNs on Graphcore's IPU, chip placement optimization, particle reconstruction at the Large Hadron Collider, and more.
SlidesLive
MLSys 2021 | Workshop of Graph Neural Networks and Systems (GNNSys'21)
The Conference on Machine Learning and Systems targets research at the intersection of machine learning and systems. The conference aims to elicit new connections amongst these fields, including identifying best practices and design principles for learning…
GML In-Depth: three forms of self-supervised learning
My new in-depth newsletter issue on self-supervised learning with applications to graphs. There is an upcoming keynote talk by Alexei Efros at ICLR'21 about self-supervised learning, and I was inspired by the motivations he presents there. In particular, he explains that self-supervised learning is a way to reduce the role of humans in designing ML pipelines, which would allow neural nets to learn in a way similar to how humans do. Self-supervised learning for graphs is an active area of research, and there are good reasons for this: for applications such as drug or catalyst discovery, there are billions of unlabeled graphs from which we would like to extract as much relevant information as possible. So self-supervised learning is becoming a new paradigm for learning such useful representations.
Graph Machine Learning
GML In-Depth: three forms of self-supervised learning
"Excelling at chess has long been considered a symbol of more general intelligence. That is an incorrect assumption in my view, as pleasant as it might be." Garry Kasparov
Knowledge Graphs @ ICLR 2021
The one and only Michael Galkin does it again with a superb digest of knowledge graph research at ICLR 2021. Topics include reasoning, temporal logic, and complex question answering in KGs: a lot of novel ideas and less SOTA-chasing work!
Medium
Knowledge Graphs @ ICLR 2021
Your guide to the KG-related research in ML, May edition
Fresh picks from ArXiv
This week on ArXiv: optimization properties of GNNs, a review of sampling-based approaches, and time zigzags for Ethereum price prediction
If I forgot to mention your paper, please shoot me a message and I will update the post.
Conferences
* Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders ACL 2021
* Neural Graph Matching based Collaborative Filtering SIGIR 2021
* Z-GCNETs: Time Zigzags at Graph Convolutional Networks for Time Series Forecasting ICML 2021
* Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth ICML 2021
Efficiency
* Scalable Graph Neural Network Training: The Case for Sampling
* VersaGNN: a Versatile accelerator for Graph neural networks
New Proof Reveals That Graphs With No Pentagons Are Fundamentally Different
A new article at Quanta about the Erdős-Hajnal conjecture, which states that any graph that forbids some small induced subgraph will inevitably have a large clique or a large independent set. The article discusses a recent paper that confirms the conjecture for a special case that was deemed the hardest. Now there is hope that the conjecture is true in the general case.
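For intuition, the pentagon C5 is itself the classic graph where both the largest clique and the largest independent set are small, which is why excluding it (and graphs resembling it) was the hard case. A brute-force check, fine only for tiny graphs and only illustrating the statement, not the proof:

```python
from itertools import combinations

def max_homogeneous(n_nodes, edges):
    """Size of the largest clique or independent set, by exhaustive search."""
    adj = {(min(u, v), max(u, v)) for u, v in edges}
    for k in range(n_nodes, 1, -1):
        for sub in combinations(range(n_nodes), k):
            pairs = list(combinations(sub, 2))
            all_edges = all(p in adj for p in pairs)       # clique?
            no_edges = not any(p in adj for p in pairs)    # independent set?
            if all_edges or no_edges:
                return k
    return 1

# The 5-cycle (pentagon): largest clique and independent set both have size 2.
print(max_homogeneous(5, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))  # 2
```

The conjecture asserts that for any fixed forbidden induced subgraph H, every H-free graph on n nodes has a homogeneous set of polynomial size n^c, far larger than the logarithmic bound that holds for arbitrary graphs.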
Quanta Magazine
New Proof Reveals That Graphs With No Pentagons Are Fundamentally Different
Researchers have proved a special case of the Erdős-Hajnal conjecture, which shows what happens in graphs that exclude anything resembling a pentagon.
PhD Theses on Graph Machine Learning
Here are some PhD dissertations on GML. Part 4 (previous here).
Adji Bousso Dieng: Deep Probabilistic Graphical Modeling (Columbia University 2020)
Dai Quoc Nguyen: Representation Learning for Graph-Structured Data (Monash University 2021)
Matteo Tiezzi: Local Propagation in Neural Network Learning by Architectural Constraints (Università degli Studi di Siena 2021)
Telegram
Graph Machine Learning
PhD Thesis on Graph Machine Learning
Here are some PhD dissertations on GML. Part 3 (previous here).
Xiaowen Dong: Multi-view signal processing and learning on graphs (EPFL 2014)
Yan Leng: Collective behavior over social networks with data-driven and…