How Uber Eats uses GNNs to power recommendations.
https://eng.uber.com/uber-eats-graph-learning/
There is a recent trend in machine learning of running ablation studies that show SOTA results are not that impressive compared to older baselines. The RecSys 2019 best paper was about exactly this (https://arxiv.org/abs/1907.06902). I think I saw similar work in NLP and CV, and now it is GML's turn. Two papers, one on knowledge graph link prediction and another on graph classification:
https://openreview.net/forum?id=BkxSmlBFvr
https://openreview.net/forum?id=HygDF6NFPB
6 papers at ICLR by the group of Jure Leskovec (3 accepts + 3 rejects)
1. Query2box: Reasoning over Knowledge Graphs in Vector Space Using Box Embeddings (https://openreview.net/forum?id=BJgr4kSFDS)
2. Strategies for Pre-training Graph Neural Networks (https://openreview.net/forum?id=HJlWWJSFDH)
3. Redundancy-Free Computation Graphs for Graph Neural Networks (https://openreview.net/forum?id=H1eF3kStPS)
4. Unifying Graph Convolutional Neural Networks and Label Propagation (https://openreview.net/forum?id=rkgdYhVtvH)
5. Selection via Proxy: Efficient Data Selection for Deep Learning (https://openreview.net/forum?id=HJg2b0VYDr)
6. Coresets for Accelerating Incremental Gradient Methods (https://openreview.net/forum?id=SygRikHtvS)
Continuing this, the group of Le Song has 7 papers at ICLR, all accepted. This is the second-best result among all authors, with Sergey Levine in first place with 13 accepted papers.
1. HOPPITY: Learning Graph Transformations to Detect and Fix Bugs in Programs (https://openreview.net/forum?id=SJeqs6EFvB)
2. GLAD: Learning Sparse Graph Recovery (https://openreview.net/forum?id=BkxpMTEtPB)
3. Efficient Probabilistic Logic Reasoning with Graph Neural Networks (https://openreview.net/forum?id=rJg76kStwH)
4. Double Neural Counterfactual Regret Minimization (https://openreview.net/forum?id=ByedzkrKvH)
5. RNA Secondary Structure Prediction By Learning Unrolled Algorithms (https://openreview.net/forum?id=S1eALyrYDH)
6. Learn to Explain Efficiently via Neural Logic Inductive Learning (https://openreview.net/forum?id=SJlh8CEYDB)
7. Learning to Plan in High Dimensions via Neural Exploration-Exploitation Trees (https://openreview.net/forum?id=rJgJDAVKvB)
There are quite a few tools to monitor new papers on ArXiv:
* arxiv-sanity.com
* arxivist.com
But you can also configure an RSS feed for the keywords you like using https://siftrss.com/
For example, if you want only papers on graphs, you can use the following links:
CS track: https://siftrss.com/f/x70NM5NWmLn
Stat track: https://siftrss.com/f/3meBo55VMyA
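If you prefer a script to an RSS reader, here is a minimal sketch (my own example, assuming the third-party feedparser package) of polling the CS-track feed above:

```python
# Poll a siftrss-filtered arXiv feed and print the latest matches.
# pip install feedparser
import feedparser

FEED_URL = "https://siftrss.com/f/x70NM5NWmLn"  # CS track, graph papers only

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:10]:  # ten most recent items
    print(entry.title)
    print(entry.link)
```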
Finally finished my post on cutting-edge research in Graph Machine Learning. A lot of interesting ideas and applications.
https://towardsdatascience.com/top-trends-of-graph-machine-learning-in-2020-1194175351a3
One of the trends I outlined in the post above is the growing number of papers on knowledge graphs. This is quite interesting, as I realize that many recommendation tasks, for example in conversational AI systems, can be modeled well with knowledge graphs instead of, say, pure deep learning methods. Here is a very fresh survey on this topic.
A Survey on Knowledge Graphs: Representation, Acquisition and Applications.
https://arxiv.org/abs/2002.00388
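To make this concrete, here is a tiny sketch (my own toy example, not from the survey) of the TransE scoring idea that such surveys cover: a triple (head, relation, tail) is plausible when head + relation lands close to tail in embedding space.

```python
# Toy TransE-style scoring (illustrative only: embeddings here are random,
# a real system would train them on observed triples).
import numpy as np

rng = np.random.default_rng(0)
names = ("user", "likes", "pizza", "sushi")
emb = {name: rng.normal(size=8) for name in names}

def transe_score(head, relation, tail):
    # Lower distance = more plausible triple.
    return np.linalg.norm(emb[head] + emb[relation] - emb[tail])

print(transe_score("user", "likes", "pizza"))
print(transe_score("user", "likes", "sushi"))
```

In a recommender, ranking candidate items by this score for a (user, likes, ?) query is exactly the link prediction task.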
Our new submission to the ICLR workshop on AI+Neuroscience: https://baicsworkshop.github.io/
Here is what I think.
1️⃣ Even for simple ideas, it still takes 5-7 days to implement them and write 4 pages.
2️⃣ More importantly, this work is about predicting IQ from EEG brain measurements. Essentially, here is (X, y): train whatever model you want and report the result. The problem is that for real data sets, simple baselines work better than your machine learning. For example, taking the most common y from the training set and predicting it for all test examples gives a result very close to ML. It would be cool to have some hints from an oracle that would say "Don't bother, these data are doomed, you can't do better with ML". If you know some theory like that, please ping me in private messages 🙁
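For reference, the majority baseline is a one-liner with scikit-learn's DummyClassifier. A minimal sketch with random stand-in data (not our EEG data):

```python
# Majority-class baseline: predict the most common training label for everyone.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import train_test_split

X = np.random.randn(200, 16)           # stand-in for EEG features
y = np.random.randint(0, 2, size=200)  # stand-in for binarized IQ labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
print("majority-class accuracy:", baseline.score(X_te, y_te))
```

If your model cannot beat this number by a clear margin, the features likely carry little signal.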
AAAI 2020 starts tomorrow in NYC.
AAAI 2020 stats:
* 7737 submissions
* 1591 accepted
* 21% acceptance rate
* 142 accepted graph papers (9% of total)

ICLR 2020 stats:
* 2213 submissions
* 687 accepted
* 31% acceptance rate
* 49 accepted graph papers (7% of total)
Two tutorials on GML at AAAI 2020.
Graph Neural Networks: Models and Applications https://aaai.org/Conferences/AAAI-20/aaai20tutorials/#fa4
Differential Deep Learning on Graphs and its Applications https://aaai.org/Conferences/AAAI-20/aaai20tutorials/#fp1
The KDD deadline is coming next week. It is one of the most popular venues for submitting a strong GML paper, even though it is a general data mining conference with all sorts of computer science papers.
In 4 of the last 5 years, the best paper award went to graph research.
2019 Network Density of States
https://www.kdd.org/kdd2019/accepted-papers/view/network-density-of-states
2018 Adversarial Attacks on Neural Networks for Graph Data
https://www.kdd.org/kdd2018/accepted-papers/view/adversarial-attacks-on-neural-networks-for-graph-data
2016 FRAUDAR: Bounding Graph Fraud in the Face of Camouflage
https://www.kdd.org/kdd2016/subtopic/view/fraudar-bounding-graph-fraud-in-the-face-of-camouflage
2015 Efficient Algorithms for Public-Private Social Networks
https://ai.googleblog.com/2015/08/kdd-2015-best-research-paper-award.html
Tutorial slides from AAAI 20.
Graph Neural Networks: Models and Applications
https://cse.msu.edu/~mayao4/tutorials/aaai2020/
Differential Deep Learning on Graphs and its Applications
https://www.calvinzang.com/DDLG_AAAI_2020.html
Workshop materials from AAAI 20.
Workshop on Deep Learning on Graphs: Methodologies and Applications (DLGMA’20)
https://dlg2019.bitbucket.io/aaai20/
There are two big libraries for building and using GNNs: Deep Graph Library (DGL) and PyTorch-Geometric (PTG).
I have personally used only the latter because it has been more popular, but it seems DGL is catching up (a minimal PTG example follows below).
* DGL is written for PyTorch, but TF support is on its way.
* DGL has 4K GitHub stars vs. PTG's 6.5K.
* DGL has more support from academia and industry (e.g. it is available on AWS).
* DGL is faster (at least in their presentations).
There is a nice workshop video from NeurIPS 19 on DGL: https://slideslive.com/38921873/graph-representation-learning-4
There are also overlapping workshop slides from AAAI 20:
https://dlg2019.bitbucket.io/aaai20/keynote_slides/George-dgl-aaai2020.pdf
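For a feel of the PTG side, here is a minimal two-layer GCN on the Cora citation network, close to the library's introductory example (a sketch only; check the current docs, the API may have changed):

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

# Cora citation network: 2708 nodes, 7 classes.
dataset = Planetoid(root="/tmp/Cora", name="Cora")
data = dataset[0]

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, 16)
        self.conv2 = GCNConv(16, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, training=self.training)
        return self.conv2(x, edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```

DGL's API is similar in spirit (message passing over a graph object), which is why switching between the two is not that painful.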
Fresh picks from ArXiv
ICML 20 submissions, AISTATS 20, graphs in math, and Stephen Hawking 👨🔬
ICML 2020 submissions
Fast Detection of Maximum Common Subgraph via Deep Q-Learning (https://arxiv.org/abs/2002.03129)
Random Features Strengthen Graph Neural Networks (https://arxiv.org/abs/2002.03155)
Hierarchical Generation of Molecular Graphs using Structural Motifs (https://arxiv.org/pdf/2002.03230.pdf)
Graph Neural Distance Metric Learning with Graph-Bert (https://arxiv.org/abs/2002.03427)
Segmented Graph-Bert for Graph Instance Modeling (https://arxiv.org/abs/2002.03283)
Haar Graph Pooling (https://arxiv.org/abs/1909.11580)
Constant Time Graph Neural Networks (https://arxiv.org/abs/1901.07868)
AISTATS 20
Laplacian-Regularized Graph Bandits: Algorithms and Theoretical Analysis (https://arxiv.org/abs/1907.05632)
Math
Some arithmetical problems that are obtained by analyzing proofs and infinite graphs (https://arxiv.org/abs/2002.03075)
Extra pearls in graph theory (https://arxiv.org/abs/1812.06627)
Distance Metric Learning for Graph Structured Data (https://arxiv.org/abs/2002.00727)
Surveys
Generalized metric spaces. Relations with graphs, ordered sets and automata : A survey (https://arxiv.org/abs/2002.03019)
Stephen Hawking 👨🔬
Stephen William Hawking: A Biographical Memoir (https://arxiv.org/abs/2002.03185)
Manually-curated list of GML papers in top AI conferences 📚
https://github.com/naganandy/graph-based-deep-learning-literature
Combinatorial Optimization + ML
How can you solve the traveling salesman problem (TSP) with ML? One way is to train an agent to make decisions about the next step. This requires that you either imitate already existing solutions or obtain a reward and then update the policy. This works if you have a solver for the problem that can generate solutions, or if the problem is easy enough to converge to the optimal value quickly (e.g. Euclidean TSP).
For harder problems, you can integrate ML inside the solver (which has exponential worst-case runtime). The solver then still guarantees the optimality of its solutions, but the heuristic choices that exist in most solvers are made by ML. This is what Exact Combinatorial Optimization with Graph Convolutional Neural Networks (https://arxiv.org/abs/1906.01629) proposes for the Branch & Bound procedure, which heuristically chooses the next variable to branch on. The results are quite impressive, showing that you can decrease the running time of SOTA solvers while preserving optimality, even though the ML model's branching choices come with no guarantees.
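To make the second idea concrete, here is a toy sketch (my own example, not the paper's code) of branch & bound for 0/1 knapsack where the branching decision is delegated to a pluggable scorer. In the paper, that scorer is a GCN trained to imitate strong branching; here it is a simple value/weight heuristic. The key point: any scorer preserves optimality, because the branching order only affects speed.

```python
# Toy 0/1 knapsack branch & bound with a pluggable branching scorer.

def fractional_bound(values, weights, capacity, fixed):
    """Upper bound from the LP relaxation: fill remaining capacity greedily."""
    total_v = sum(values[i] for i, x in fixed.items() if x == 1)
    total_w = sum(weights[i] for i, x in fixed.items() if x == 1)
    if total_w > capacity:
        return float("-inf")  # infeasible partial assignment
    free = sorted((i for i in range(len(values)) if i not in fixed),
                  key=lambda i: values[i] / weights[i], reverse=True)
    cap, bound = capacity - total_w, total_v
    for i in free:
        if weights[i] <= cap:
            bound, cap = bound + values[i], cap - weights[i]
        else:
            bound += values[i] * cap / weights[i]  # take a fraction, then stop
            break
    return bound

def branch_and_bound(values, weights, capacity, score):
    """`score(i, fixed)` ranks free variables; a trained GNN could go here."""
    best = [0]
    def recurse(fixed):
        if fractional_bound(values, weights, capacity, fixed) <= best[0]:
            return  # prune: this subtree cannot beat the incumbent
        free = [i for i in range(len(values)) if i not in fixed]
        if not free:
            best[0] = sum(values[i] for i, x in fixed.items() if x == 1)
            return
        i = max(free, key=lambda j: score(j, fixed))  # the ML hook
        for x in (1, 0):
            recurse({**fixed, i: x})
    recurse({})
    return best[0]

values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
print(branch_and_bound(values, weights, capacity,
                       lambda i, fixed: values[i] / weights[i]))  # -> 220
```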