Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
The paper essentially shows that graphs are not necessary for graph classification: if you represent a graph as just a set of nodes, without any information about their adjacency, and train an MLP model, you can get SOTA results. An important lesson for when we judge the quality of an idea/paper based on empirical results.
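The idea is easy to sketch. Here is a toy illustration (all weights, shapes, and features are made up, not from the paper): treat the graph as an unordered set of node feature vectors, apply a shared MLP to each node, and mean-pool, never looking at the adjacency matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical node features for one graph: 5 nodes, 4 features each.
# The adjacency matrix is deliberately ignored: the graph is treated
# as an unordered set of nodes.
X = rng.normal(size=(5, 4))

# Toy weights: a shared per-node MLP, then mean pooling over nodes.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))  # 3 graph classes

def set_mlp_logits(X):
    h = np.maximum(X @ W1, 0.0)  # same MLP applied to every node
    g = h.mean(axis=0)           # permutation-invariant pooling
    return g @ W2                # graph-level class logits

logits = set_mlp_logits(X)
print(logits.shape)  # (3,)
```

Because pooling is a mean over nodes, shuffling the node order does not change the prediction; node identity is the only "structure" this model ever sees.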
Forwarded from Sergey Ivanov
Shows a significant boost on IQ-like tests (originally introduced by DeepMind: https://deepmind.com/blog/article/measuring-abstract-reasoning) when graphs are used to represent the diagrams.
Forwarded from Sergey Ivanov
I had very high expectations for this paper, as I had also seen some convincing studies showing that you don't need graphs for many datasets. Essentially, the authors decompose an embedding from the neighborhood into signal and noise and show that with lots of noise you don't need any topology. They also propose an aggregation function that decides how to filter noise from the neighbors. What's interesting is that they claim mean aggregation is better than sum, whereas for the GIN network sum aggregation is proved to be more powerful than mean.
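The mean-vs-sum tension is easy to illustrate with a toy multiset example (this is not the paper's aggregation function, just the standard argument behind GIN): mean aggregation collapses neighborhoods that differ only in multiplicity, while sum keeps them apart:

```python
import numpy as np

# Two different neighborhoods (multisets of neighbor features):
# mean aggregation cannot tell them apart, sum can.
n_a = np.array([[1.0, 0.0], [1.0, 0.0]])  # two identical neighbors
n_b = np.array([[1.0, 0.0]])              # one neighbor

mean_a, mean_b = n_a.mean(axis=0), n_b.mean(axis=0)
sum_a, sum_b = n_a.sum(axis=0), n_b.sum(axis=0)

print(np.allclose(mean_a, mean_b))  # True: mean discards multiplicity
print(np.allclose(sum_a, sum_b))    # False: sum preserves it
```

So sum is strictly more expressive on multisets, yet mean can still win empirically when neighbor multiplicity is mostly noise, which is consistent with the paper's claim.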
The authors are super-confident in their claims, but the conclusions are quite important (if correct). What they show is that you can use structural embeddings instead of distance-based embeddings and vice versa, as they are equivalent. Structural embeddings are typically used for node classification and distance-based embeddings for link prediction, but apparently they are not that different.
The paper proposes new embeddings based on (almost) anonymous walks... a few years after the original paper. Can I resubmit my own papers and get them accepted?
The paper proposes a GNN for knowledge graph reasoning. But what's really interesting is that the AC single-handedly saves this paper, turning three rejects into an accept.
NetLSD: Hearing the Shape of a Graph
Proposes a distance between graphs, essentially an L2 distance between more advanced spectral signatures of the two graphs.
https://arxiv.org/abs/1805.10712
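A minimal sketch of the heat-trace idea behind NetLSD (exact eigendecomposition on tiny graphs; the paper also develops scalable approximations, and this sketch assumes no isolated nodes):

```python
import numpy as np

def heat_trace_signature(A, ts):
    """Heat-trace signature h(t) = sum_j exp(-t * lambda_j) over the
    normalized Laplacian spectrum (assumes no isolated nodes)."""
    d = A.sum(axis=1)
    L = np.eye(len(A)) - A / np.sqrt(np.outer(d, d))
    lam = np.linalg.eigvalsh(L)
    return np.array([np.exp(-t * lam).sum() for t in ts])

# Compare a triangle with a 3-node path: same size, different spectra,
# so their signatures (and hence the L2 distance between them) differ.
tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
ts = np.logspace(-2, 2, 50)
dist = np.linalg.norm(heat_trace_signature(tri, ts)
                      - heat_trace_signature(path, ts))
print(dist > 0)  # True
```

The signature is permutation-invariant by construction (it depends only on the spectrum), which is what makes it usable as a graph-to-graph distance.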
Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data (ICLR 2020)
The paper proposes a DNN architecture where each layer is an ensemble of gradient-boosted decision trees (GBDT), such that the outputs of the previous layer are passed forward to the next one. A quite interesting contribution is how they make these tree layers differentiable for end-to-end training.
https://arxiv.org/pdf/1909.06312.pdf
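A rough sketch of one differentiable oblivious tree, the building block of such a layer (simplified: softmax relaxation instead of the paper's entmax, and toy random parameters):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def soft_oblivious_tree(x, feat_logits, thresholds, leaf_values, tau=1.0):
    """Depth-d oblivious tree with soft (differentiable) splits.
    Simplified sketch: the paper uses entmax where we use softmax/sigmoid."""
    depth = len(thresholds)
    # Probability of going right at each of the d levels.
    p_right = np.empty(depth)
    for lvl in range(depth):
        feat = softmax(feat_logits[lvl]) @ x  # soft feature selection
        p_right[lvl] = 1 / (1 + np.exp(-(feat - thresholds[lvl]) / tau))
    # Each leaf's probability is the product of per-level branch probs.
    out = 0.0
    for leaf in range(2 ** depth):
        p = 1.0
        for lvl in range(depth):
            bit = (leaf >> lvl) & 1
            p *= p_right[lvl] if bit else (1 - p_right[lvl])
        out += p * leaf_values[leaf]
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=4)
y = soft_oblivious_tree(x, rng.normal(size=(2, 4)), rng.normal(size=2),
                        rng.normal(size=4))
print(np.isfinite(y))  # True
```

Because every hard argmax/threshold is replaced by a smooth relaxation, the whole tree is differentiable in its parameters, so stacks of such trees can be trained end-to-end with backprop.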
Computed some stats about graph papers in ICLR 2020. There are a few interesting things.

(1) Every third paper on graphs is accepted, a clear indication that GML is becoming popular;
(2) On average, scores of [6,6,8] are needed to get accepted; [6,6,6] would be borderline.
(3) An AC can sometimes save a paper, even if it got low scores. This is rather good: it means the reviewers are not the only ones who decide.
(4) Likewise, an AC can reject a paper even if it is a unanimous accept from the reviewers. I think this happens mostly when the paper does not present enough experimental comparison to SOTA.

https://medium.com/@sergei.ivanov_24894/iclr-2020-graph-papers-9bc2e90e56b0
Recent papers on graph matching.

Scalable Gromov-Wasserstein Learning for Graph Partitioning and Matching (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=13486

KerGM: Kernelized Graph Matching (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=14512

(Nearly) Efficient Algorithms for the Graph Matching Problem on Correlated Random Graphs (NeurIPS 2019) https://nips.cc/Conferences/2019/Schedule?showEvent=13959

Gromov-Wasserstein Learning for Graph Matching and Node Embedding (ICML 2019) https://icml.cc/Conferences/2019/Schedule?showEvent=3845

Graph Matching Networks for Learning the Similarity of Graph Structured Objects (ICML 2019) https://deepmind.com/research/publications/Graph-matching-networks-for-learning-the-similarity-of-graph-structured-objects

Learning deep graph matching with channel-independent embedding and Hungarian attention (ICLR 2020) https://openreview.net/forum?id=rJgBd2NYPH

Deep Graph Matching Consensus (ICLR 2020) https://openreview.net/forum?id=HyeJf1HKvS

Spectral Graph Matching and Regularized Quadratic Relaxations II: Erdős-Rényi Graphs and Universality (ICML 2020) https://arxiv.org/abs/1907.08883

Graph Optimal Transport for Cross-Domain Alignment (ICML 2020) https://arxiv.org/abs/2006.14744
Our resubmission of the paper from ICLR to IJCAI. It taught me how to strip a paper down from 21 pages to 6. Also, there are 9K submissions, and one author of each submission must agree to review three other papers, so I expect a lot of noise, but I still hope for the best.