Graph Machine Learning Books
For a long time I thought the community lacked proper books on graph machine learning, and I even thought maybe I should write one. But luckily there are other active people: within a single day of each other, two (!) books were announced.
Graph Representation Learning Book by Will Hamilton, which so far has 3 main chapters: node embeddings, GNNs, and generative models. While the drafts are ready, there is still a long way to go before it becomes a comprehensive book, and the author promises to work on that. Great start.
Deep Learning on Graphs by Yao Ma and Jiliang Tang. This should be available next month and should focus on the foundations of GNNs as well as applications.
That's great; hopefully they will become handbooks for those who want to start in this area. Now waiting for the same in educational courses 🙏
Mining and Learning with Graphs Workshop
The MLG workshop is a recurring workshop on various ML solutions for graphs. The videos for each poster can be found here. Keynotes should be available soon (except for Danai Koutra's, which is available now).
Graph Machine Learning research groups: Pietro Liò
I do a series of posts on the groups in graph research; the previous post is here. The 13th is Pietro Liò, a computational biologist and the supervisor of Petar Veličković. He has also been very active in GML recently (54 papers in 2020), so he could be a good choice if you want to do a PhD in this area.
Pietro Liò (~1965)
- Affiliation: University of Cambridge;
- Education: Ph.D. in Theoretical Genetics at the University of Firenze, Italy (1995) and Ph.D. in Engineering at the University of Pavia, Italy (2007);
- h-index: 50;
- Awards: Lagrange Fellowship; best papers at ISEM, MCED, FET;
- Interests: graph neural networks, computational biology, signal processing.
JuliaCon2020 Graph Videos
While Python is the default language for analyzing graphs, numerous other languages provide packages for working with graphs. At the recent JuliaCon, devoted to the Julia programming language, many talks were about new graph packages, with applications to transportation networks, dynamical systems, geometric deep learning, knowledge graphs, and more. Check out the full program here.
Fresh picks from ArXiv
This week ArXiv presents papers on visualization of graphs, robustness certificates, and a survey on combinatorial optimization ♟
GNN
• All About Knowledge Graphs for Actions
• The Effectiveness of Interactive Visualization Techniques for Time Navigation of Dynamic Graphs on Large Displays
• Argo Lite: Open-Source Interactive Graph Exploration and Visualization in Browsers
• Accelerating Force-Directed Graph Drawing with RT Cores
• Learning Robust Node Representation on Graphs
• Certified Robustness of Graph Neural Networks against Adversarial Structural Perturbation
• Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More
Survey
• Graph Embedding for Combinatorial Optimization: A Survey
Topology-Based Papers at ICML 2020
Topological data analysis studies applications of topological methods to real-world data, for example constructing and studying a proper manifold given only 3D points. The topic is gaining increasing attention, and a new post by Bastian Rieck discusses topological papers at ICML 2020, which include graph filtration techniques, topological autoencoders, and normalizing flows.
GNN aggregators talk
Today (6 pm European time) Petar Veličković will speak about his work on Principal Neighbourhood Aggregation for Graph Nets. He will discuss how you can design better neighborhood aggregators for your GNNs.
Stream: https://youtube.com/watch?v=c00GuCe62mk
Slides: https://petar-v.com/talks/PNA-AISC.pdf
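The gist of combining multiple aggregators can be sketched as follows (a toy NumPy illustration of the idea only, not the actual PNA layer, which also uses degree-based scalers): instead of committing to a single aggregation over a node's neighbours, concatenate several.

```python
import numpy as np

def multi_aggregate(neigh_feats):
    """neigh_feats: (num_neighbours, dim) array -> concatenation of four aggregators."""
    return np.concatenate([
        neigh_feats.mean(axis=0),   # smooth summary
        neigh_feats.max(axis=0),    # extreme values
        neigh_feats.sum(axis=0),    # degree-sensitive
        neigh_feats.std(axis=0),    # spread of the neighbourhood
    ])

# Three neighbours with 2-dimensional features:
h = np.array([[1.0, 2.0], [3.0, 0.0], [5.0, 4.0]])
print(multi_aggregate(h))  # mean [3, 2], max [5, 4], sum [9, 6], std ≈ [1.63, 1.63]
```

A downstream linear layer can then mix the concatenated statistics, so the network is no longer limited by what a single aggregator can distinguish.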
GML Newsletter Issue #2
The second newsletter is out!
Blog posts (graph laplacians, SIGN, quantum GNN, TDA), videos (MLSS-Indo, PNA), events (KDD, Israeli workshops, JuliaCon), books, and upcoming events (graph drawing symposium, data fest).
Graph Convolutional Networks Lecture
A lecture by Xavier Bresson, given as part of an NYU course, is now available on YouTube. It covers spectral and spatial architectures, as well as benchmarking between them. Additionally, you can find the practical session and slides on the course webpage.
DeepMind's Traffic Prediction with Advanced Graph Neural Networks
A new blog post by DeepMind describes how to apply GNNs to travel time predictions. There are not many details about the model itself (which makes me wonder whether a deep net trained across all supersegments would suffice), but there are curious details about training.
1. As the road network is presumably huge, they sample subgraphs in proportion to traffic density. This should be similar to GraphSAGE-like approaches.
2. Sampled subgraphs can vary a lot within a single batch, so they use RL to select subgraphs properly. I guess it's some form of imitation learning that selects graphs in a batch based on some objective value.
3. They use the MetaGradients algorithm to select the learning rate. It was previously used to parametrize returns in RL; I guess here it parametrizes the learning rate instead.
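For point 3, here is a rough sketch of the general meta-gradient idea as I understand it (an illustration on a toy quadratic objective, not DeepMind's actual algorithm): treat the learning rate as a meta-parameter and update it with the gradient of the post-update loss with respect to it.

```python
import numpy as np

def loss(w):   # toy objective: L(w) = 0.5 * ||w||^2
    return 0.5 * float(w @ w)

def grad(w):   # its gradient
    return w

w = np.array([5.0, -3.0])
eta, meta_lr = 0.01, 0.001          # learning rate and meta-learning rate
for _ in range(100):
    g = grad(w)
    w_next = w - eta * g            # inner (regular) gradient step
    # dL(w_next)/d(eta) = grad(w_next) . d(w_next)/d(eta) = -grad(w_next) . g
    meta_grad = -float(grad(w_next) @ g)
    eta = max(eta - meta_lr * meta_grad, 1e-4)   # outer (meta) step on eta
    w = w_next
print(f"adapted eta = {eta:.3f}, final loss = {loss(w):.4f}")
```

On this toy problem the meta-update grows eta while the loss is large and lets it settle as the loss shrinks, which is the behavior the blog post seems to rely on.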
Fresh picks from ArXiv
This week on ArXiv: a new library for KG embeddings, a version of batch norm for graphs, and a survey on SVD decompositions 🎙
GNN
• TorchKGE: Knowledge Graph Embedding in Python and PyTorch
• Heterogeneous Graph Neural Network for Recommendation
• FairGNN: Eliminating the Discrimination in Graph Neural Networks with Limited Sensitive Attribute Information
Scaling
• GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training
• Rethinking Graph Regularization For Graph Neural Networks
• Lifelong Graph Learning
Survey
• A Survey of Singular Value Decomposition Methods for Distributed Tall/Skinny Data
Graph ML at Twitter
A post by Michael Bronstein and Zehan Wang about the current challenges of using graph models in industry settings: scalability, heterogeneous graphs, dynamic graphs, and the presence of noise.
On the evaluation of graph neural networks
Over the last year there have been many revealing benchmark papers that re-evaluate existing GNNs on standard tasks such as node classification (see this and this, for example). However, the gap between claimed and real results still exists, and it is especially noticeable when the baselines are not properly selected.
For one, using an MLP on node features alone often leads to better results than GNNs. This is surprising, as GNNs can be seen as a generalization of MLPs. I encounter this more and more on new data sets, although for several data sets (e.g. Cora) you can clearly see the advantage of using GNNs.
Another ML model that I haven't seen tried in graph settings is GBDT (e.g. XGBoost, CatBoost, LightGBM). GBDT models are de-facto winners of many Kaggle competitions on tabular data, so you could expect that, given enough variability in your node features, just running GBDT on them would often make a good baseline. I have tried this for several problems, and it often outperforms the method proposed in the paper. For example, for node classification, GBDT on the Bus data set achieves 100% accuracy (vs. ~80% in the paper). And on graph classification, GBDT can beat other top GNN models (see image below). Considering how easy it is to run experiments with GBDT models, I would expect it to be a good counterpart to MLP in the realm of baselines.
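A minimal sketch of such a baseline, with synthetic data in place of a real graph data set and sklearn's GradientBoostingClassifier standing in for XGBoost/CatBoost/LightGBM: train a GBDT on node features alone, ignoring the graph entirely.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))              # "node features"; no graph is used at all
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # toy labels driven purely by the features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
gbdt = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"feature-only GBDT accuracy: {gbdt.score(X_te, y_te):.2f}")
```

If a GNN cannot beat this few-line script on a given data set, the graph structure is probably not contributing much signal on that data set.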
Graph Machine Learning research groups: Danai Koutra
I do a series of posts on the groups in graph research; the previous post is here. The 14th is Danai Koutra, an ex-PhD student of Christos Faloutsos. She leads the graph exploration lab at the University of Michigan and could be a great Ph.D. advisor if you are interested in GML.
Danai Koutra (~1988)
- Affiliation: University of Michigan
- Education: Ph.D. at Carnegie Mellon University in 2015 (advisor: Christos Faloutsos);
- h-index: 25;
- Awards: ACM SIGKDD 2016 Dissertation Award; best paper awards at ICDM, PAKDD, ICDT;
- Interests: graph mining, knowledge graphs, graph embeddings.
Latent graph neural networks: Manifold learning 2.0?
One of the hot topics of this year is constructing a graph from unstructured data (e.g. 3D points or images). In a new post, Michael Bronstein discusses existing approaches to latent graph learning and suggests that using a GNN both to learn the graph structure and to solve the downstream task can be a better alternative than a decoupled approach. This is indeed an exciting and active area of research, with open problems and known applications to NLP, physics, and biology.
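The simplest instance of constructing a graph from unstructured data is a k-NN graph over the points, which a GNN can then operate on. A minimal sketch with toy points and brute-force distances:

```python
import numpy as np

def knn_graph(points, k):
    """points: (n, d) array -> list of directed edges (i -> its k nearest neighbours)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    np.fill_diagonal(d2, np.inf)            # a point is not its own neighbour
    nn = np.argsort(d2, axis=1)[:, :k]      # k closest indices per point
    return [(i, int(j)) for i in range(len(points)) for j in nn[i]]

# Two well-separated clusters of 3D points:
pts = np.array([[0.0, 0, 0], [0.1, 0, 0], [5, 5, 5], [5.1, 5, 5]])
print(knn_graph(pts, k=1))  # → [(0, 1), (1, 0), (2, 3), (3, 2)]
```

The decoupled approach fixes this graph up front; the latent-graph approach instead learns the edges jointly with the downstream task.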
Fresh picks from ArXiv
This week on ArXiv: robot planning in the presence of many objects by researchers at MIT, a new SOTA on probabilistic type inference for software, and an application of GNNs to fitting clothes on different body shapes 👚
Applications
- Planning with Learned Object Importance in Large Problem Instances using Graph Neural Networks (with Joshua Tenenbaum)
- Advanced Graph-Based Deep Learning for Probabilistic Type Inference
- GINet: Graph Interaction Network for Scene Parsing (ECCV 2020)
- Fully Convolutional Graph Neural Networks for Parametric Virtual Try-On (SIGGRAPH 2020)
- Addressing Cold Start in Recommender Systems with Hierarchical Graph Neural Networks
Theory
- Learning an Interpretable Graph Structure in Multi-Task Learning
- Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization
- Adversarial Attack on Large Scale Graph
Graph ML at Data Fest 2020
This year, together with @IggiSv9t, I am organizing a track at Data Fest 2020. It's like a workshop at a conference, but more informal. We will have videos from our amazing speakers, and also networking, where you can speak to me, @IggiSv9t, the speakers, or other people interested in graph machine learning. Besides our track, there will be many other interesting tracks on all aspects of ML and DS (interpretability, antifraud, ML in healthcare, and 40 more!).
It will be this weekend, 19-20 September. You need to register (for free) at https://fest.ai/2020/.
Our videos:
Day 1 (Saturday)
1. Opening remarks: Graph Machine Learning, Sergey Ivanov, Criteo, France
2. Graph-Based Nearest Neighbor Search: Practice and Theory, Liudmila Prokhorenkova, Yandex, Russia
3. Graphical Models for Tensor Networks and Machine Learning, Roman Schutski, Skoltech, Russia
4. Unsupervised Graph Representations, Anton Tsitsulin, University of Bonn & Google, Germany
5. Placing Knowledge Graphs in Graph ML, Michael Galkin, TU Dresden, Germany
Day 2 (Sunday)
1. Large Graph Visualization Tools and Approaches, Sviatoslav Kovalev, Samokat, Russia
2. Business Transformation as Graph Problems, Vadim Safronov, Key Points, Portugal
3. Scene Graph Generation from Images, Boris Knyazev, University of Guelph & Vector Institute, Canada
4. AutoGraph: Graphs Meet AutoML, Denis Vorotinsev, Oura, Finland
5. Link Prediction with Graph Neural Networks, Maxim Panov, Skoltech, Russia
See you there!
On Cora dataset
Cora, Citeseer, and Pubmed are three popular data sets for node classification. Cora is one of those cases where you can clearly see the power of GNNs: on it, GNNs reach around 80% accuracy, while GBDT/MLP get only around 60%. This is not often the case: for many data sets I see only a marginal win for GNNs compared to non-graph methods, and for some data sets GNNs are actually worse.
So why is GNN performance so great on this data set? I don't have a good answer, but here are some thoughts. Cora is a citation network, where nodes are papers and classes are the papers' fields. However, it's not clear what the links between the documents are: the original paper didn't describe how exactly the links were established. If the links were based on citations, i.e. two papers are connected if one cites the other, then that could explain such a big improvement for GNNs: a GNN sees all nodes during training, while an MLP sees only the training nodes, and since two linked papers likely share the same field, a GNN can leverage this graph information. If that's the case, a simple k-NN majority-vote baseline should perform similarly to a GNN. However, people who know the authors of the original paper say that the links were established based on word similarity between documents. If that's true, I'm not sure why GNNs do so well on this data set. In any case, building graphs from real-world data requires a lot of attention and transparency, which is why structure learning is such an active topic.
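That majority-vote baseline is trivial to implement. A minimal sketch on a made-up toy graph: predict a node's class as the most common class among its labeled neighbors.

```python
from collections import Counter

def neighbor_majority_vote(adj, labels, node, default=None):
    """adj: {node: set of neighbors}; labels: {node: class} for labeled nodes only."""
    votes = Counter(labels[n] for n in adj[node] if n in labels)
    return votes.most_common(1)[0][0] if votes else default

# Toy citation graph: papers 0-2 are field "A", papers 3-5 are field "B";
# node 2 is unlabeled and sits mostly inside the "A" cluster.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
labels = {0: "A", 1: "A", 3: "B", 4: "B", 5: "B"}
print(neighbor_majority_vote(adj, labels, 2))  # → "A" (two "A" votes vs one "B")
```

If links encode citations, this one-hop vote already exploits the same homophily a GNN would, which is why it makes a useful sanity check.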
Graph ML at Data Fest 2020
Day 1 was a pleasant surprise: people with different backgrounds came, watched videos, and asked questions. Here are the 5 videos of day 1:
1. Opening remarks: Graph Machine Learning, Sergey Ivanov, Criteo, France (where I talk broadly about what GML is, what the best resources are, what the community looks like, etc.);
2. Graph-Based Nearest Neighbor Search: Practice and Theory, Liudmila Prokhorenkova, Yandex, Russia (where she spoke about k-NN search on graphs, HNSW, theory, and her ICML 2020 work);
3. Graphical Models for Tensor Networks and Machine Learning, Roman Schutski, Skoltech, Russia (where he spoke about graphical models, treewidth, and tensor decomposition);
4. Unsupervised Graph Representations, Anton Tsitsulin, University of Bonn & Google, Germany (where he spoke about all the popular node embedding methods and their pros and cons);
5. Placing Knowledge Graphs in Graph ML, Michael Galkin, TU Dresden, Germany (it's all you need to know about knowledge graphs if you don't know what they are).
On day 2, tomorrow, we will have 5 more videos, about applications of graphs.
Please join us tomorrow at https://spatial.chat/s/ods at 12pm (Moscow time).