Mathematicians Settle Erdős Coloring Conjecture
The Erdős-Faber-Lovász conjecture states that the minimum number of colors necessary to shade the edges of a hypergraph so that no overlapping edges have the same color is bounded by the number of vertices. After 50 years of research, it has finally been resolved.
Quanta Magazine
Fifty years ago, Paul Erdős and two other mathematicians came up with a graph theory problem that they thought they might solve on the spot. A team of mathematicians has finally settled it.
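To make the statement concrete, here is a small illustrative sketch (a greedy pass, not the proof technique from the paper; the example hypergraph is my own) that colors the edges of a toy linear hypergraph and stays within the n-color bound:

```python
from itertools import count

def greedy_edge_coloring(hyperedges):
    """Greedily color hyperedges so that any two edges sharing a vertex get
    different colors. The Erdős-Faber-Lovász conjecture (now a theorem for
    large n) says n colors suffice for an n-vertex linear hypergraph."""
    colors = []
    for e in hyperedges:
        # colors already used by earlier edges that intersect e
        used = {c for prev, c in zip(hyperedges, colors) if set(prev) & set(e)}
        colors.append(next(c for c in count() if c not in used))
    return colors

# a linear hypergraph on 4 vertices (any two edges meet in at most one vertex)
edges = [{0, 1, 2}, {0, 3}, {1, 3}, {2, 3}]
print(greedy_edge_coloring(edges))  # → [0, 1, 2, 3]
```

On this 4-vertex example the greedy pass happens to use exactly 4 colors, matching the conjectured bound (greedy does not achieve the bound in general; the theorem guarantees a coloring exists).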
Fresh picks from ArXiv
This week on ArXiv: improved power of GNNs, new autoML library for graphs, and decreasing query time for graph traversal 🕔
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Theoretically Improving Graph Neural Networks via Anonymous Walk Graph Kernels
* A Graph VAE and Graph Transformer Approach to Generating Molecular Graphs
* Learning to Coordinate via Multiple Graph Neural Networks
* DyGCN: Dynamic Graph Embedding with Graph Convolutional Network
* Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs with Philip S. Yu
* The World as a Graph: Improving El Niño Forecasts with Graph Neural Networks
kNN
* Graph Reordering for Cache-Efficient Near Neighbor Search with Alex Smola
Software
* AutoGL: A Library for Automated Graph Learning
Survey
* Representation Learning for Networks in Biology and Medicine: Advancements, Challenges, and Opportunities with Marinka Zitnik
Outlier detection and description workshop at KDD 2021
Graph methods are very popular in fraud detection, as they are capable of distinguishing the interactions of fraudsters from those of benign users. There is a big workshop at KDD 2021 about detecting and describing outliers, with a great list of keynote speakers.
oddworkshop.github.io
ODD 2021 - 6th Outlier Detection and Description Workshop
ODD 2021, 6th Outlier Detection and Description Workshop, co-located with KDD 2021, Virtual
Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks
A video presentation (and slides) by Cristian Bodnar & Fabrizio Frasca on a new type of GNN that defines neighborhoods based on the simplicial complexes of a graph. It goes quite deep into the theory, with supporting experiments in graph isomorphism, graph classification, and trajectory disambiguation.
Videos from CS224W
The legendary Stanford course CS224W on graph ML is now releasing its 2021 videos on YouTube, with two lectures promised each week. Slides are available on the site too (homeworks are still missing).
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 1.1 - Why Graphs
Jure Leskovec
Computer Science, PhD
Graphs are a general language for describing and analyzing entities with relations/interactions.…
Fresh picks from ArXiv
This week on ArXiv: equivalence of graph matching and GED, hyperbolic GNNs, and de-anon of blockchain 💲
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks with Philip S. Yu
* Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks with Philip S. Yu
* Generative Causal Explanations for Graph Neural Networks
* DistGNN: Scalable Distributed Training for Large-Scale Graph Neural Networks
* Identity Inference on Blockchain using Graph Neural Network
Conferences
* MEG: Generating Molecular Counterfactual Explanations for Deep Graph Networks IJCNN 2021
* FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks MLSys Workshop 2021
* Search to aggregate neighborhood for graph neural network ICDE 2021
Hyperbolic
* A Hyperbolic-to-Hyperbolic Graph Convolutional Network
* Lorentzian Graph Convolutional Networks
Math
* On the unification of the graph edit distance and graph matching problems
Self-supervised learning of GNNs
Self-supervised learning (SSL) is a learning paradigm for settings where we have large amounts of unlabeled data and want to obtain representations of the input that we can reuse later for downstream tasks. The difference between unsupervised and self-supervised learning is that unsupervised learning attempts to learn a representation on a single input, while SSL assumes a model trained across several inputs.
An example of unsupervised learning on graphs is graph kernels, which boil down to counting some statistics on graphs (e.g. motifs) that represent the graph. An example of SSL is first creating multiple views of the same graph (e.g. by permuting the edges) and then training a model to distinguish views of different graphs. DeepWalk, node2vec, and other pre-GNN node embeddings are somewhere in between: they are usually applied to a single graph, but the concept could well be applied to learning representations on many graphs too.
There is a recent boom in this area for graphs, so fresh surveys are available (here and here), as well as an awesome list of SSL-GNNs.
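The view-based recipe can be sketched in a few lines; everything here is a toy stand-in (the "encoder" is just a degree histogram, and a real pipeline would use a GNN with gradient training), but it shows the moving parts: an augmentation that produces two views per graph, an encoder, and a contrastive loss that pulls the two views of the same graph together:

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(edges, p=0.2):
    """Augmentation: keep each edge with probability 1 - p."""
    keep = rng.random(len(edges)) > p
    return [e for e, k in zip(edges, keep) if k]

def encode(edges, n_nodes, dim=8):
    """Stand-in 'encoder': a normalized degree histogram."""
    deg = np.zeros(n_nodes)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist, _ = np.histogram(deg, bins=dim, range=(0, dim))
    h = hist.astype(float)
    return h / (np.linalg.norm(h) + 1e-9)

def info_nce(z1, z2, tau=0.5):
    """Contrastive loss: each graph's two views should agree more
    than views of different graphs in the batch."""
    sim = z1 @ z2.T / tau                       # (batch, batch) similarities
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # diagonal = positive pairs

# a toy batch: a path graph and a star graph on 6 nodes
graphs = [
    ([(i, i + 1) for i in range(5)], 6),  # path
    ([(0, i) for i in range(1, 6)], 6),   # star
]
z1 = np.stack([encode(drop_edges(e), n) for e, n in graphs])
z2 = np.stack([encode(drop_edges(e), n) for e, n in graphs])
loss = info_nce(z1, z2)
print(round(float(loss), 3))
```

Minimizing this loss over a trainable encoder is what the SSL methods above do at scale.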
Awesome graph repos
Collections of methods and papers for specific graph topics.
Graph-based Deep Learning Literature — Links to Conference Publications and the top 10 most-cited publications, Related workshops, Surveys / Literature Reviews / Books in graph-based deep learning.
awesome-graph-classification — A collection of graph classification methods, covering embedding, deep learning, graph kernel and factorization papers with reference implementations.
Awesome-Graph-Neural-Networks — A collection of resources related to graph neural networks.
awesome-graph — A curated list of resources for graph databases and graph computing tools.
awesome-knowledge-graph — A curated list of Knowledge Graph related learning materials, databases, tools and other resources.
awesome-knowledge-graph — A curated list of awesome knowledge graph tutorials, projects and communities.
Awesome-GNN-Recommendation — graph mining for recommender systems.
awesome-graph-attack-papers — links to works about adversarial attacks and defenses on graph data or GNNs.
Graph-Adversarial-Learning — Attack-related papers, Defense-related papers, Robustness Certification papers, etc., ranging from 2017 to 2021.
awesome-self-supervised-gnn — Papers about self-supervised learning on GNNs.
awesome-self-supervised-learning-for-graphs — A curated list for awesome self-supervised graph representation learning resources.
Awesome-Graph-Contrastive-Learning — A collection of resources related to Graph Contrastive Learning.
GitHub
GitHub - naganandy/graph-based-deep-learning-literature: links to conference publications in graph-based deep learning
Graph Machine Learning research groups: Leman Akoglu
I do a series of posts on the groups in graph research; the previous post is here. The 27th is Leman Akoglu, a professor at Carnegie Mellon University, with interests in detecting anomalies in graphs.
Leman Akoglu (~1983)
- Affiliation: Carnegie Mellon University
- Education: Ph.D. at Duke University in 2012 (advisor: Christos Faloutsos)
- h-index 40
- Interests: anomaly detection, graph neural networks
- Awards: best research paper awards at PAKDD, SIAM SDM, ECML PKDD
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Mingyuan Zhou
I do a series of posts on the groups in graph research; the previous post is here. The 26th is Mingyuan Zhou, a professor at the University of Texas, who has been working on statistical aspects of GNNs. …
Graph Neural Networks in Computational Biology
Slides from Petar Veličković about his journey applying machine learning algorithms to biological data.
Fresh picks from ArXiv
This week on ArXiv: accelerated inference for GNNs, graph MLP and graph-augmented sponsored search 🔍
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Learnable Online Graph Representations for 3D Multi-Object Tracking
* GMLP: Building Scalable and Flexible Graph Neural Networks with Feature-Message Passing
* Accelerating SpMM Kernel with Cache-First Edge Sampling for Graph Neural Networks
Conferences
* Efficient Relation-aware Scoring Function Search for Knowledge Graph Embedding ICDE 2021
* Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning SIGIR 2021
* AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search SIGIR 2021
GNN User Group: meeting 4
The fourth meeting of the GNN user group will include a talk from me (Sergey Ivanov) about the combination of GBDT and GNNs, and one from professor Pan Li of Purdue University, who will speak about constructing structural features to improve representations in temporal networks. Please join us on Thursday!
Eventbrite
Graph Neural Networks User Group
Videos from WebConf 2021
Videos from WebConf 2021 are available here. Many graph talks cover topics such as GNNs, graph models, knowledge graphs, graph embeddings, link prediction, and more.
videolectures.net
30. The Web Conference VIRTUAL EDITION, Ljubljana 2021
The 30th Web Conference was to be held in Ljubljana, the capital of Slovenia, in the heart of Europe. Due to the worldwide health situation, it was instead held as a completely virtual conference. We invite you to join…
Geometric Deep Learning Book
A new book on geometric deep learning by graph ML experts Michael M. Bronstein, Joan Bruna, Taco Cohen, and Petar Veličković has been released: 156 pages exploring the symmetries that unify different neural network architectures. An accompanying post nicely introduces the history of geometry and its impact on physics. It's exciting to see a categorization of many ML approaches from the perspective of group theory.
Medium
Geometric foundations of Deep Learning
Geometric Deep Learning is an attempt to unify a broad class of ML problems from the perspectives of symmetry and invariance.
Graph Representation Learning for Drug Discovery Slides
Slides from Jian Tang of the talk on de novo drug discovery and drug repurposing.
GNN User Group Meeting 4 video
Video from the 4th meeting of the GNN user group, which includes a talk from me (on the GBDT+GNN model) and one from professor Pan Li on causal anonymous walks for temporal graphs. Slides can be found on the DGL Slack channel.
YouTube
Graph Neural Networks User Group Meeting on April 29, 2021
Agenda 4/29/2021:
4:00 - 4:30 (PST): Boost then Convolve: Gradient Boosting Meets Graph Neural Networks (Dr. Sergey Ivanov, Criteo, Russia).
4:30 - 5:00 (PST): Inductive Representation Learning of Temporal Networks via Causal Anonymous Walks (Prof. Pan Li…
Invariant and equivariant layers with applications to GNN, PointNet and Transformers
A blog post by Marc Lelarge about invariant and equivariant functions and their relation to the universality and expressivity of GNNs. As the main result, they show that any invariant/equivariant function on n points can be represented as a sum of functions applied to each point independently.
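The sum-decomposition form can be illustrated with a tiny DeepSets-style example (the particular phi and rho below are arbitrary choices of mine, not from the post): an invariant function is built as rho applied to the sum of phi over the points, so permuting the points cannot change the output:

```python
import numpy as np

def phi(x):
    """Per-point feature map, applied to each point independently."""
    return np.stack([x, x ** 2], axis=-1)  # shape (n, 2)

def rho(s):
    """Readout applied to the summed features."""
    return s[0] + 0.5 * s[1]

def invariant_f(points):
    """DeepSets form rho(sum_i phi(x_i)): invariant to point order,
    because summation commutes with any permutation."""
    return rho(phi(points).sum(axis=0))

x = np.array([1.0, 2.0, 3.0])
perm = np.array([2, 0, 1])
assert np.isclose(invariant_f(x), invariant_f(x[perm]))  # order does not matter
print(invariant_f(x))  # → 13.0  (sum = 6, sum of squares = 14; 6 + 0.5*14)
```

The universality result in the post says the converse too: with rich enough phi and rho, every permutation-invariant function on n points can be written this way.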
Fresh picks from ArXiv
This week on ArXiv: knowledge graph in production at Alibaba, links between WL and GNN, and more robust classifier with energy-based view 🔋
If I forgot to mention your paper, please shoot me a message and I will update the post.
Applications
* Billion-scale Pre-trained E-commerce Product Knowledge Graph Model ICDE 2021
* Improving Conversational Recommendation System by Pretraining on Billions Scale of Knowledge Graph ICDE 2021
* Fast Multiscale Diffusion on Graphs
* An Energy-Based View of Graph Neural Networks
* Structure-Aware Hierarchical Graph Pooling using Information Bottleneck
* Node Embedding using Mutual Information and Self-Supervision based Bi-level Aggregation
* Unified Spatio-Temporal Modeling for Traffic Forecasting using Graph Neural Network
Survey
* The Logic of Graph Neural Networks with Martin Grohe
* Graph Vulnerability and Robustness: A Survey
* Graph Learning: A Survey
* Graph Neural Networks for Traffic Forecasting
Video: Workshop of Graph Neural Networks and Systems (GNNSys'21)
Very interesting videos from the workshop at MLSys 21 on GNNs in industry. The talks cover topics such as GNNs on Graphcore's IPU, chip placement optimization, particle reconstruction at the Large Hadron Collider, and more.
SlidesLive
MLSys 2021 | Workshop of Graph Neural Networks and Systems (GNNSys'21)
The Conference on Machine Learning and Systems targets research at the intersection of machine learning and systems. The conference aims to elicit new connections amongst these fields, including identifying best practices and design principles for learning…
GML In-Depth: three forms of self-supervised learning
My new in-depth newsletter on self-supervised learning with applications to graphs. There is an upcoming keynote talk by Alexei Efros at ICLR'21 about self-supervised learning, and I was inspired by the motivations he discusses there. In particular, he explains that self-supervised learning is a way to reduce the role of humans in designing ML pipelines, which would allow neural nets to learn in a way similar to how humans do. Self-supervised learning for graphs is an active area of research, and there are good reasons for this: for applications such as drug or catalyst discovery, there are billions of unlabeled graphs from which we would like to extract as much relevant information as possible. So self-supervised learning is becoming a new paradigm for learning such useful representations.
Graph Machine Learning
GML In-Depth: three forms of self-supervised learning
"Excelling at chess has long been considered a symbol of more general intelligence. That is an incorrect assumption in my view, as pleasant as it might be." Garry Kasparov