Node regression problem
I asked on Twitter what node regression data sets are available and got quite a few interesting responses.
1. There are pure node regression data sets, but not many. One can use Wikipedia, Pokec, or the data sets from this paper. I hope to release a couple more data sets like these soon.
2. You can also find data sets in spatiotemporal prediction on graphs (e.g., traffic forecasting). You are given a graph plus the velocity on each lane, and you are asked to predict the velocity in the future. In my opinion it's a toy problem: there are no features associated with the nodes (except for speed). But you can take a look at DCRNN, STGCN, GaAN, Graph WaveNet, STGRAT, and other models that deal with it.
3. You can find node regression in work on simulating physics. A node is a particle; it has a few features (e.g., position and velocity), and you are asked to predict its acceleration. This is an interesting problem, but I haven't found data sets for it. You probably need to write your own simulator.
4. Next-scene prediction. Essentially the same as the previous one, but the objects can be anything: for example, objects in the camera view of a self-driving car. You are asked to predict the next position of every object. I don't know if anyone has tried to solve this problem.
5. Action prediction for an RL agent. NerveNet did it. Each agent is represented as a graph, and you predict an action for each node.
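Point 3 above mentions writing your own simulator. A minimal sketch of what such a generator could look like (the spring-force setup, k-NN connectivity, and all constants here are my own assumptions, not from any published benchmark):

```python
import numpy as np

def make_particle_graph(n=50, k=5, seed=0):
    """Toy node-regression sample: features are position + velocity,
    the target is the acceleration induced by pairwise spring forces."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, size=(n, 2))
    vel = rng.normal(0, 0.1, size=(n, 2))

    # connect each particle to its k nearest neighbours (excluding itself)
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    nbrs = np.argsort(dist, axis=1)[:, 1:k + 1]
    edges = np.array([(i, j) for i in range(n) for j in nbrs[i]])

    # spring force F = -k_s * (x_i - x_j) summed over neighbours; unit mass
    k_s = 1.0
    acc = np.zeros((n, 2))
    for i, j in edges:
        acc[i] -= k_s * (pos[i] - pos[j])

    x = np.concatenate([pos, vel], axis=1)  # node features, shape (n, 4)
    return x, edges, acc                    # acc is the regression target
```

Each call yields one graph with 4-dimensional node features and a 2-dimensional regression target per node; stepping the positions forward with the computed accelerations would give a temporal version.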
Twitter
Sergey Ivanov
There are many data sets for graph classification/regression and node classification. Anyone knows if there are *any* node *regression* data sets?
Brief analytics of KDD papers
Here are some plots for the upcoming KDD 2020 (research track only). It's interesting to compare it against ICML 2020. You can check out the git repo with the analysis. Here are the highlights:
1. The top affiliations at KDD differ from those at ICML, with several "small" names at the top.
2. Leading authors are almost all from China.
3. There are more authors per paper at KDD (~4-5) than at ICML (~3-4). There is only a single paper with a single author.
4. There are ~65 graph papers, with a handful on pure graph algorithms.
5. In total there are 217 research papers. Graphs comprise about 30% of all papers.
6. Wordcloud confirms: graphs are the most used word in titles.
7. "Geodesic Forests" is the shortest title that appeared.
Medium
ICML 2020. Comprehensive analysis of authors, organizations, and countries.
Who published the most?
Symposium on Graph Drawing and Network Visualization
It's cool to see that there is a small conference on visualizing graphs. Registration is free until 10 September.
gd2020.cs.ubc.ca
Graph Drawing 2020 | Graph Drawing 2020 Conference
The 28th International Symposium on Graph Drawing and Network Visualization will be held online from September 16 to 18 due to the uncertainty regarding future travel and physical meeting restrictions caused by the Corona virus outbreak. We plan to keep…
KDD 2020 Highlights
I haven't found any highlights about KDD 2020, so I did my own. What's interesting is that there are many papers on the scalability of GNNs, the intersection of graphs and recommendation, and clustering algorithms. Paper Digest lets you quickly browse through the papers.
Medium
KDD-2020 Highlights
Let’s take a look at some of the highlights of this year’s KDD — one of the biggest applied research conferences in Computer Science.
Fresh picks from ArXiv
This week on ArXiv: new graph data sets for vision-and-language and recommendation, new optimization methods for GNNs, and applications to RL 🕹
GNN
• Optimization of Graph Neural Networks with Natural Gradient Descent
• Training Matters: Unlocking Potentials of Deeper Graph Convolutional Neural Networks
Data sets
• VisualSem: a high-quality knowledge graph for vision and language
• COOKIE: A Dataset for Conversational Recommendation over Knowledge Graphs in E-commerce
Applications
• Multi-Agent Reinforcement Learning with Graph Clustering
• SF-GRASS: Solver-Free Graph Spectral Sparsification
• Learning Graph Edit Distance by Graph Neural Networks
• Community-Aware Graph Signal Processing
Graph Machine Learning Books
For a long time I thought that the community lacked proper books on graph machine learning and even considered writing one myself. But luckily there are other active people: within one day of each other, two (!) books were announced.
Graph Representation Learning Book by Will Hamilton, which so far has 3 main chapters: node embeddings, GNNs, and generative models. While the drafts are ready, there is still a long way to go to make it a comprehensive book, and the author promises to keep working on it. A great start.
Deep Learning on Graphs by Yao Ma and Jiliang Tang. This should be available next month and focuses on the foundations of GNNs as well as applications.
That's great; hopefully they will become handbooks for those who want to start in this area. Now I'm waiting for the same for educational courses 🙏
Mining and Learning with Graphs Workshop
The MLG workshop is a regular workshop on various ML solutions for graphs. The videos for each poster can be found here. The keynotes should be available soon (except for Danai Koutra's, which is available now).
Graph Machine Learning research groups: Pietro Liò
I do a series of posts on groups in graph research; the previous post is here. The 13th is Pietro Liò, a computational biologist and the supervisor of Petar Veličković. He has also been very active in GML recently (54 papers in 2020), so he could be a good choice if you want to do a PhD in this area.
Pietro Liò (~1965)
- Affiliation: University of Cambridge
- Education: Ph.D. in Theoretical Genetics at University of Firenze, Italy in 1995 and Ph.D. in Engineering at University of Pavia, Italy in 2007;
- h-index: 50;
- Awards: Lagrange Fellowship, best papers at ISEM, MCED, FET;
- Interests: graph neural networks, computational biology, signal processing.
Telegram
Graph Machine Learning
Graph Machine Learning research groups: Xavier Bresson
I do a series of posts on groups in graph research; the previous post is here. The 12th is Xavier Bresson, a conference and tutorial organizer on graph machine learning.
Xavier Bresson (~1975)
- Affiliation:…
JuliaCon2020 Graph Videos
While Python is the default language for analyzing graphs, numerous other languages provide packages for dealing with graphs. At the recent JuliaCon, devoted to the Julia programming language, many talks were about new graph packages with applications to transportation networks, dynamical systems, geometric deep learning, knowledge graphs, and more. Check out the full program here.
julialang.org
The Julia Programming Language
The official website for the Julia Language. Julia is a language that is fast, dynamic, easy to use, and open source. Click here to learn more.
Fresh picks from ArXiv
This week ArXiv presents papers on visualization of graphs, robustness certificates, and a survey on combinatorial optimization ♟
GNN
• All About Knowledge Graphs for Actions
• The Effectiveness of Interactive Visualization Techniques for Time Navigation of Dynamic Graphs on Large Displays
• Argo Lite: Open-Source Interactive Graph Exploration and Visualization in Browsers
• Accelerating Force-Directed Graph Drawing with RT Cores
• Learning Robust Node Representation on Graphs
• Certified Robustness of Graph Neural Networks against Adversarial Structural Perturbation
• Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More
Survey
• Graph Embedding for Combinatorial Optimization: A Survey
Topology-Based Papers at ICML 2020
Topological data analysis studies applications of topological methods to real-world data, for example, constructing and studying a proper manifold given only 3D points. This topic is gaining increasing attention, and a new post by Bastian Rieck discusses topological papers at ICML 2020, including graph filtration techniques, topological autoencoders, and normalizing flows.
GNN aggregators talk
Today (6 pm European time) Petar Veličković will speak about their work on Principal Neighbourhood Aggregation for Graph Nets. He will discuss how you can design better neighborhood aggregators for your GNNs.
Stream: https://youtube.com/watch?v=c00GuCe62mk
Slides: https://petar-v.com/talks/PNA-AISC.pdf
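As a rough illustration of the idea behind the talk, here is a sketch of PNA-style aggregation: several aggregators over a node's neighbours combined with degree-based scalers, all concatenated. The exact scaler formulas and the normalizer `delta` follow my reading of the paper (there, `delta` is estimated from the training set's degree distribution); this is an assumption, not the authors' reference code.

```python
import numpy as np

def pna_aggregate(h, neighbors, delta=1.0):
    """PNA-style aggregation sketch: apply four aggregators (mean, max,
    min, std) to each node's neighbour features, scale each result with
    three degree-based scalers (identity, amplification, attenuation),
    and concatenate everything. Assumes every node has >= 1 neighbour."""
    out = []
    for nbrs in neighbors:
        nh = h[nbrs]                          # (degree, features)
        aggs = [nh.mean(0), nh.max(0), nh.min(0), nh.std(0)]
        s_amp = np.log(len(nbrs) + 1) / delta   # amplification scaler
        scalers = [1.0, s_amp, 1.0 / s_amp]     # identity / amplify / attenuate
        out.append(np.concatenate([s * a for a in aggs for s in scalers]))
    return np.stack(out)                      # (n, 4 * 3 * features)
```

The point of combining multiple aggregators is that no single one (e.g., mean alone) can distinguish all neighbourhood distributions; the scalers let the network account for node degree.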
YouTube
Principal Neighbourhood Aggregation for Graph Nets | AISC
Speaker(s): Petar Veličković
Facilitator(s): Nabila Abraham
Find the recording, slides, and more info…
GML Newsletter Issue #2
The second newsletter is out!
Blog posts (graph laplacians, SIGN, quantum GNN, TDA), videos (MLSS-Indo, PNA), events (KDD, Israeli workshops, JuliaCon), books, and upcoming events (graph drawing symposium, data fest).
Graph Convolutional Networks Lecture
A lecture by Xavier Bresson, part of the NYU course, is now available on YouTube. It covers spectral and spatial architectures as well as benchmarking between them. Additionally, you can find the practical session and slides on the course webpage.
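As a taste of the spatial view covered in the lecture, here is a sketch of a single graph-convolution layer, H' = ReLU(D^{-1/2} Â D^{-1/2} H W). This is the standard GCN formulation, not code taken from the lecture itself.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: add self-loops, symmetrically normalise the
    adjacency matrix, propagate node features, apply a linear map + ReLU."""
    A_hat = A + np.eye(A.shape[0])            # adjacency with self-loops
    d = A_hat.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # D^-1/2 Â D^-1/2
    return np.maximum(A_norm @ H @ W, 0.0)    # ReLU activation
```

Each layer mixes a node's features with those of its neighbours; stacking layers grows the receptive field by one hop per layer.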
YouTube
Week 13 – Lecture: Graph Convolutional Networks (GCNs)
Course website: https://bit.ly/DLSP20-web
Playlist: https://bit.ly/pDL-YouTube
Speaker: Xavier Bresson
Week 13: https://bit.ly/DLSP20-13
0:00:00 – Week 13 – Lecture
LECTURE Part A
In this section, we discuss the architecture and convolution of traditional…
DeepMind's Traffic Prediction with Advanced Graph Neural Networks
DeepMind recently released a new blog post that describes how to apply GNNs to travel time prediction. There are not many details about the model itself (which makes me wonder whether a deep net trained across all supersegments would suffice), but there are curious details about training.
1. As the road network is huge, I suppose, they sample subgraphs in proportion to traffic density. This should be similar to GraphSAGE-like approaches.
2. Sampled subgraphs can vary a lot within a single batch, so they use RL to select subgraphs properly. I guess it's some form of imitation learning that selects graphs in a batch based on some objective value.
3. They use the MetaGradients algorithm to select a learning rate; it was previously used to parametrize returns in RL. I guess here it parametrizes the learning rate instead.
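The blog post doesn't give code, but point 1 might look roughly like this. A hypothetical sketch only: the helper names, the density weighting, and the k-hop expansion are my assumptions, not DeepMind's actual pipeline.

```python
import random

def sample_subgraphs(adj, density, n_samples, hops=2, seed=0):
    """Sketch of density-proportional subgraph sampling: pick seed road
    segments with probability proportional to their traffic density,
    then expand each seed into its k-hop neighbourhood
    (GraphSAGE-style local subgraphs for minibatch training)."""
    rng = random.Random(seed)
    nodes = list(adj)
    weights = [density[v] for v in nodes]
    seeds = rng.choices(nodes, weights=weights, k=n_samples)

    subgraphs = []
    for s in seeds:
        frontier, visited = {s}, {s}
        for _ in range(hops):
            frontier = {u for v in frontier for u in adj[v]} - visited
            visited |= frontier
        subgraphs.append(visited)
    return subgraphs
```

Weighting seeds by density means busy parts of the road network are seen more often during training, which matches the stated goal of focusing capacity where traffic actually is.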
Google DeepMind
Traffic prediction with advanced Graph Neural Networks
By partnering with Google, DeepMind is able to bring the benefits of AI to billions of people all over the world. From reuniting a speech-impaired user with his original voice, to helping users disco…