The authors are very confident in their claims, but the conclusions, if correct, are quite important. They show that structural embeddings and distance-based (positional) embeddings are equivalent and can be substituted for one another. Structural embeddings are typically used for node classification and distance-based embeddings for link prediction, yet apparently the two are not that different.
The paper proposes new embeddings based on (almost) anonymous walks... a few years after the original paper. Can I resubmit my own papers and get accepted?
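For reference, an anonymous walk replaces each node in a random walk with the index of its first occurrence, so walks are compared by their pattern rather than by node identity. A minimal sketch, assuming networkx (the function name is mine):

```python
# Sketch: anonymize a random walk by first-occurrence index (assumes networkx).
import random
import networkx as nx

def anonymous_walk(G, start, length):
    walk, seen = [start], {start: 0}
    for _ in range(length):
        nxt = random.choice(list(G[walk[-1]]))  # random neighbor step
        seen.setdefault(nxt, len(seen))         # assign index on first visit
        walk.append(nxt)
    return tuple(seen[v] for v in walk)

G = nx.karate_club_graph()
print(anonymous_walk(G, start=0, length=5))  # e.g. (0, 1, 0, 2, 1, 3)
```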
The paper proposes a GNN for knowledge graph reasoning. But what's really interesting is that the AC single-handedly saved this paper, turning 3 rejects into an accept.
NetLSD: Hearing the Shape of a Graph
Proposes a distance between graphs, essentially an L2 distance between more advanced spectral descriptors (heat trace signatures) of the two graphs.
https://arxiv.org/abs/1805.10712
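To make the idea concrete, here is a minimal sketch of a NetLSD-style distance, assuming networkx and numpy (the actual paper also normalizes the signatures, which is omitted here):

```python
# Sketch: heat trace signature h(t) = sum_j exp(-t * lambda_j) over the
# normalized Laplacian spectrum; graphs are compared by L2 distance.
import numpy as np
import networkx as nx

def heat_trace_signature(G, ts=np.logspace(-2, 2, 250)):
    lam = np.linalg.eigvalsh(nx.normalized_laplacian_matrix(G).toarray())
    return np.array([np.exp(-t * lam).sum() for t in ts])

def netlsd_distance(G1, G2):
    return np.linalg.norm(heat_trace_signature(G1) - heat_trace_signature(G2))

# Example: a cycle and a path on 20 nodes "sound" different
print(netlsd_distance(nx.cycle_graph(20), nx.path_graph(20)))
```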
Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data (ICLR 2020)
The paper proposes a DNN architecture where each layer is an ensemble of (oblivious) decision trees, GBDT-style, with the outputs of the previous layer passed forward to the next one. A quite interesting contribution is how to make those tree layers differentiable for end-to-end training.
https://arxiv.org/pdf/1909.06312.pdf
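For intuition, here is a minimal sketch of one differentiable "soft" oblivious tree layer in PyTorch. This is a simplified stand-in for the paper's construction (the paper uses entmax-based feature selection; here it is a plain linear split with a sigmoid):

```python
# Sketch: a soft oblivious tree; the same split decision is shared at each
# depth level, and the output is a probability-weighted mix of leaf values.
import torch
import torch.nn as nn

class SoftObliviousTree(nn.Module):
    def __init__(self, in_dim, depth=3, out_dim=1):
        super().__init__()
        self.feature_weights = nn.Parameter(torch.randn(depth, in_dim))
        self.thresholds = nn.Parameter(torch.zeros(depth))
        self.leaf_values = nn.Parameter(torch.randn(2 ** depth, out_dim))

    def forward(self, x):  # x: (batch, in_dim)
        # p[:, d] is the (soft) probability of going right at level d
        p = torch.sigmoid(x @ self.feature_weights.T - self.thresholds)
        # accumulate the probability of reaching each of the 2^depth leaves
        leaf_prob = torch.ones(x.shape[0], 1, device=x.device)
        for d in range(p.shape[1]):
            leaf_prob = torch.cat([leaf_prob * (1 - p[:, d:d+1]),
                                   leaf_prob * p[:, d:d+1]], dim=1)
        return leaf_prob @ self.leaf_values  # (batch, out_dim)

print(SoftObliviousTree(in_dim=8)(torch.randn(4, 8)).shape)  # torch.Size([4, 1])
```

Because every operation is smooth, gradients flow through the splits, which is what allows stacking such layers and training the whole ensemble end to end.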
Computed some stats about graph papers at ICLR 2020. There are a few interesting things:
(1) Every third paper on graphs is accepted, a clear indication that GML is becoming popular.
(2) On average, scores like [6,6,8] are needed to get accepted; [6,6,6] would be borderline.
(3) An AC can sometimes save a paper even if it got low scores. This is rather good: it means reviewers are not the only ones who decide.
(4) Likewise, an AC can reject a paper even if the reviewers unanimously accept it. I think this happens mostly when the paper does not present enough experimental comparison to SOTA.
https://medium.com/@sergei.ivanov_24894/iclr-2020-graph-papers-9bc2e90e56b0
Recent papers on graph matching.
Scalable Gromov-Wasserstein Learning for Graph Partitioning and Matching (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=13486
KerGM: Kernelized Graph Matching (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=14512
(Nearly) Efficient Algorithms for the Graph Matching Problem on Correlated Random Graphs (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=13959
Gromov-Wasserstein Learning for Graph Matching and Node Embedding (ICML 2019) https://icml.cc/Conferences/2019/Schedule?showEvent=3845
Graph Matching Networks for Learning the Similarity of Graph Structured Objects (ICML 2019) https://deepmind.com/research/publications/Graph-matching-networks-for-learning-the-similarity-of-graph-structured-objects
Learning deep graph matching with channel-independent embedding and Hungarian attention (ICLR 2020) https://openreview.net/forum?id=rJgBd2NYPH
Deep Graph Matching Consensus (ICLR 2020) https://openreview.net/forum?id=HyeJf1HKvS
Spectral Graph Matching and Regularized Quadratic Relaxations II: Erdős-Rényi Graphs and Universality (ICML 2020) https://arxiv.org/abs/1907.08883
Graph Optimal Transport for Cross-Domain Alignment (ICML 2020) https://arxiv.org/abs/2006.14744
Our resubmission of the paper from ICLR to IJCAI. It taught me how to strip a paper down from 21 pages to 6. Also, there are 9K submissions, and one of the authors of each submission must agree to review three other papers, so I expect a lot of noise, but I still hope for the best.
What a tragedy for the authors :) "On Understanding Knowledge Graph Representation" got 6,6,6 with quite positive reviews, only to see the AC reject it without much explanation.
https://openreview.net/forum?id=SygcSlHFvS
How Uber Eats uses GNNs to power recommendations.
https://eng.uber.com/uber-eats-graph-learning/
There is a recent trend in machine learning papers of doing ablation studies showing that SOTA results are not that great compared to old baselines. The RecSys 2019 best paper was about this (https://arxiv.org/abs/1907.06902). I think I saw similar works in NLP and CV, and now it's time for GML. Two papers, one on knowledge graph link prediction and another on graph classification:
https://openreview.net/forum?id=BkxSmlBFvr
https://openreview.net/forum?id=HygDF6NFPB
6 papers at ICLR by the group of Jure Leskovec (3 accepts + 3 rejects)
1. Query2box: Reasoning over Knowledge Graphs in Vector Space Using Box Embeddings (https://openreview.net/forum?id=BJgr4kSFDS)
2. Strategies for Pre-training Graph Neural Networks (https://openreview.net/forum?id=HJlWWJSFDH)
3. Redundancy-Free Computation Graphs for Graph Neural Networks (https://openreview.net/forum?id=H1eF3kStPS)
4. Unifying Graph Convolutional Neural Networks and Label Propagation (https://openreview.net/forum?id=rkgdYhVtvH)
5. Selection via Proxy: Efficient Data Selection for Deep Learning (https://openreview.net/forum?id=HJg2b0VYDr)
6. Coresets for Accelerating Incremental Gradient Methods (https://openreview.net/forum?id=SygRikHtvS)
Continuing this, the group of Le Song has 7 papers at ICLR, all accepted. This is the second-best result overall; the first is Sergey Levine with 13 accepts.
1. HOPPITY: Learning Graph Transformations to Detect and Fix Bugs in Programs (https://openreview.net/forum?id=SJeqs6EFvB)
2. GLAD: Learning Sparse Graph Recovery (https://openreview.net/forum?id=BkxpMTEtPB)
3. Efficient Probabilistic Logic Reasoning with Graph Neural Networks (https://openreview.net/forum?id=rJg76kStwH)
4. Double Neural Counterfactual Regret Minimization (https://openreview.net/forum?id=ByedzkrKvH)
5. RNA Secondary Structure Prediction By Learning Unrolled Algorithms (https://openreview.net/forum?id=S1eALyrYDH)
6. Learn to Explain Efficiently via Neural Logic Inductive Learning (https://openreview.net/forum?id=SJlh8CEYDB)
7. Learning to Plan in High Dimensions via Neural Exploration-Exploitation Trees (https://openreview.net/forum?id=rJgJDAVKvB)
There are quite a few tools to monitor new papers on ArXiv:
* arxiv-sanity.com
* arxivist.com
But you can also configure an RSS feed on the keywords you like by using https://siftrss.com/
For example, if you want papers only on graphs, you can use the following links (see the sketch after the list for polling one):
CS track: https://siftrss.com/f/x70NM5NWmLn
Stat track: https://siftrss.com/f/3meBo55VMyA
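A few lines with the third-party feedparser package are enough to poll one of these filtered feeds; a sketch, using the CS-track URL above:

```python
# Sketch: print the latest entries from the filtered arXiv feed (assumes
# the `feedparser` package: pip install feedparser).
import feedparser

feed = feedparser.parse("https://siftrss.com/f/x70NM5NWmLn")
for entry in feed.entries[:5]:
    print(entry.title, "-", entry.link)
```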
Finally finished the post on cutting-edge research in Graph Machine Learning. A lot of interesting ideas and applications.
https://towardsdatascience.com/top-trends-of-graph-machine-learning-in-2020-1194175351a3
