ICLR 2020, Day 4
The final day of ICLR 2020. I promise. You can unmute this channel now.
1. What graph neural networks cannot learn: depth vs width portal link
2. The Logical Expressiveness of Graph Neural Networks portal link
3. Reinforced Genetic Algorithm Learning for Optimizing Computation Graphs portal link
4. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations portal link
5. Contrastive Learning of Structured World Models portal link
6. GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding portal link
7. An Inductive Bias for Distances: Neural Nets that Respect the Triangle Inequality portal link
8. Learning deep graph matching with channel-independent embedding and Hungarian attention portal link
9. On the Equivalence between Positional Node Embeddings and Structural Graph Representations portal link
Thoughts from the first virtual conference
I had a nice experience at the virtual ICLR 2020. Most of the poster sessions were empty, which allowed me to bother authors with questions. Each paper had two slots during the day, so I could always attend it. Chat made it easy to find attendees, something I always struggled with at physical conferences. In terms of the insights I gained, it was much more valuable to me than a real conference, though I didn't present and can understand that presenters may not have gotten what they wanted.
By the way, the organizers promised to make the portal available to everyone soon.
Now, here are some insights I gained from the papers.
1) Theoretical analysis of GNNs is a hot topic. We now know some problems that GNNs can approximate, functions they can compute, and limitations they have. [paper 1, paper 2, paper 3, paper 4]
2) One emerging topic is teaching GNNs to learn algorithms instead of solving classification tasks. Here be dragons. [paper 1, paper 2]
3) GNNs are used to represent programs and equations, so potentially you can prove theorems with them. [paper 1, paper 2, paper 3, paper 4, paper 5]
What graph neural networks cannot learn: depth vs width
Several graph problems are impossible unless the product of a graph neural network's depth and width exceeds a polynomial of the graph size.
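To give a rough sense of the statement (my loose paraphrase of the flavor of the result, not the paper's exact bounds or constants): for several decision problems on graphs with $n$ nodes, a message-passing GNN of depth $d$ and width $w$ can only succeed when

$$ d \cdot w \;=\; \Omega\!\left(n^{\delta}\right) \quad \text{for some constant } \delta > 0, $$

so networks that are simultaneously shallow and narrow provably fail once the graphs get large enough.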
Videos from Geometric and Relational Deep Learning Workshop
Videos are available from the workshop. Two of my favorites are:
* Peter Battaglia: Learning Physics with Graph Neural Networks [video]
* Yaron Lipman: Deep Learning of Irregular and Geometric Data [video]
Peter Battaglia - Learning Physics with Graph Neural Networks
ELLIS Workshop on Geometric and Relational Deep Learning
https://geometric-relational-dl.github.io
Fresh picks from ArXiv
This week presents the new graph datasets OGB, accepted papers at ACL and SIGIR, and a survey on the Winograd challenge.
GNN
• Open Graph Benchmark: Datasets for Machine Learning on Graphs with Jure Leskovec
• Low-Dimensional Hyperbolic Knowledge Graph Embeddings with the group of Christopher Ré
• Graph Homomorphism Convolution
Graph Theory
• Independent Set on Pk-Free Graphs in Quasi-Polynomial Time
• Tree-depth and the Formula Complexity of Subgraph Isomorphism
Conferences
• Alleviating the Inconsistency Problem of Applying Graph Neural Network to Fraud Detection SIGIR 20
• Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward ACL 20
• Bipartite Flat-Graph Network for Nested Named Entity Recognition ACL 20
• LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network ACL 20
• A Review of Winograd Schema Challenge Datasets and Approaches IJCAI 20
KDD 2020: Workshop on Deep Learning on Graphs
If you missed the ICML deadline, there is another good workshop for GML at KDD.
Deadline: 15 June
5 pages, double-blind
Graph Representation Learning for Algorithmic Reasoning
Another idea appearing more frequently in recent graph papers is to learn a particular graph algorithm, such as Bellman-Ford or breadth-first search, instead of doing node classification or link prediction. Here is a video from WebConf'20 by Petar Veličković (DeepMind) motivating this approach.
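To make the setup concrete, here is a minimal sketch (my own illustration, not code from the talk) of how step-wise supervision can be generated from Bellman-Ford: run the algorithm on a toy graph and record the distance vector after every relaxation round, so a GNN can be trained to imitate each intermediate step rather than only the final answer.

```python
def bellman_ford_trace(n, edges, source=0):
    """Run Bellman-Ford and record the distance vector after each
    relaxation round -- these are the step-wise targets a GNN would imitate."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0.0
    trace = [dist[:]]
    for _ in range(n - 1):  # n-1 rounds suffice without negative cycles
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
        trace.append(dist[:])
    return trace

# toy directed weighted graph: (u, v, weight)
edges = [(0, 1, 4.0), (0, 2, 1.0), (2, 1, 2.0), (1, 3, 1.0)]
trace = bellman_ford_trace(4, edges)
print(trace[-1])  # [0.0, 3.0, 1.0, 4.0]
```

Each entry of `trace` is one supervision signal: the network sees the graph plus the distances at step t and must predict the distances at step t+1.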
Graph Representation Learning for Algorithmic Reasoning
Slide deck: https://petar-v.com/talks/Algo-WWW.pdf
Graph Machine Learning research groups: Michael Bronstein
I am doing a series of posts on research groups in graph ML. The fifth is Michael Bronstein. He founded Fabula AI, a company that detects fake news in social networks, which was acquired by Twitter. He was also a committee member at my PhD defense.
Michael Bronstein (1980)
- Affiliation: Imperial College London; Twitter
- Education: Ph.D. at the Israel Institute of Technology (Technion) in 2007 (supervised by Ron Kimmel);
- h-index: 61;
- Awards: IEEE and IAPR Fellow, Dalle Molle prize, Royal Society Wolfson Merit award;
- Interests: computer graphics, geometric deep learning, graph neural networks.
Max Welling Talk GNN
I recently thought about what other types of GNNs exist beyond message passing. I think one of them is equivariant networks, i.e., neural networks with permutation-equivariance properties, but I believe there are other powerful graph models yet to be discovered.
In this video, Max Welling discusses his recent works on equivariant NNs for meshes and factor GNNs.
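As a toy check of what permutation equivariance means in practice (my sketch, not from the talk): a plain message-passing layer commutes with any relabeling of the nodes, i.e., permuting the inputs and then applying the layer gives the same result as applying the layer and then permuting its output.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)           # symmetric adjacency matrix
H = rng.standard_normal((n, d))  # node features
W = rng.standard_normal((d, d))  # layer weights

def gnn_layer(A, H, W):
    # one message-passing step: aggregate neighbor features, then transform
    return np.tanh(A @ H @ W)

P = np.eye(n)[rng.permutation(n)]  # random permutation matrix
permute_after = P @ gnn_layer(A, H, W)
permute_before = gnn_layer(P @ A @ P.T, P @ H, W)
print(np.allclose(permute_after, permute_before))  # True
```

The check passes for any permutation because the aggregation treats all neighbors identically; equivariant-network research asks which richer layers keep this property.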
MIT Talk GNNs May 08 2020
Injecting Inductive Bias in Graph Neural Networks:
Equivariant Mesh Neural Networks and Neural Augmented (Factor) Graph Neural Networks.
Co-authors Mesh-NNs: Pim de Haan, Maurice Weiler and Taco Cohen
Co-author Neural Augmented Factor Graph NNs: Victor Garcia…
Fresh picks from ArXiv
It's Tuesday, which means we look back at the previous week of arXiv. In today's episode, among the most interesting papers are a new knowledge graph for PubMed and new surveys on graph machine learning and quantum deep learning.
Applications
• Building a PubMed knowledge graph
• Reinforcement Learning with Feedback Graphs
• Predicting gene expression from network topology using graph neural networks
• On new record graphs close to bipartite Moore graphs
Conferences
• Bundle Recommendation with Graph Convolutional Networks SIGIR 20
• TAGNN: Target Attentive Graph Neural Networks for Session-based Recommendation SIGIR 20
• Adversarial Graph Embeddings for Fair Influence Maximization over Social Networks IJCAI 20
• Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics ACL 20
Survey
• Machine Learning on Graphs: A Model and Comprehensive Taxonomy with Christopher Ré
• Comparison and Benchmark of Graph Clustering Algorithms
• Advances in Quantum Deep Learning: An Overview
PhD Theses on Graph Machine Learning
Here are some PhD dissertations on GML (including mine).
Nino Shervashidze: Scalable graph kernels
Petar Veličković: The resurgence of structure in deep neural networks
Sergei Ivanov: Combinatorial and neural graph vector representations
Thomas Kipf: Deep learning with graph-structured representations
Introduction to Deep Learning (I2DL)
There is a course on deep learning by the Technical University of Munich. Recordings, slides, and exercises are available online.
Secrets of the Surface: The Mathematical Vision of Maryam Mirzakhani
There is a documentary you can watch on the life of Maryam Mirzakhani. In 2014, she was awarded the Fields Medal for her work on "the dynamics and geometry of Riemann surfaces and their moduli spaces." You can read about her in this article. For the film, you can register here and they will send a Vimeo link, available until 19 May.
Quanta Magazine
A Tenacious Explorer of Abstract Surfaces
Maryam Mirzakhani, who became the first woman Fields medalist for drawing deep connections between topology, geometry and dynamical systems, has died of cancer at the age of 40. This is our 2014…
AI and Theorem Proving
One of the topics that caught my attention was using AI to automate theorem proving. Apparently, there is already a conference on this. At ICLR there was a paper on using graph networks for theorem proving.
Besides this conference, which mainly explores how to model mathematical logic using embeddings, I think another type of theorem proving relies on smart pruning of combinatorial spaces (e.g., you have a large space of graphs from which you need to pick particular examples).
Learning graph structure to help classification
I recently discussed whether it's possible to create a graph from a non-graph classification data set and thereby improve classification performance, and I found two works on it.
The first approach simply tries different values of k, connects the points into a k-NN graph for each setting, and verifies the classification performance of a graph model on the obtained graph. The obvious problem is that you have to run classification many times, once for each k-NN parameter setting.
The second approach (ICML 2019, link to presentation) is more data-driven: instead of freezing the k-NN parameters, it uses a graph generative model to generate a graph from the points. A graph neural network then makes a classification prediction and is updated by backpropagation together with the parameters of the generative model. It's still quite heavy, as you need to update the parameters of two models instead of one. Perhaps future work will create the graphs at little computational cost and boost the results of classification pipelines.
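A minimal sketch of the first, fixed-parameter approach (my illustration; the papers' actual pipelines differ): connect each point to its k nearest neighbors, then hand the resulting adjacency matrix to a graph model, repeating for several values of k.

```python
import numpy as np

def knn_graph(X, k):
    """Symmetric adjacency matrix of a k-NN graph over the rows of X."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-loops
    nbrs = np.argsort(d, axis=1)[:, :k]    # indices of the k nearest points
    A = np.zeros_like(d)
    A[np.arange(len(X))[:, None], nbrs] = 1.0
    return np.maximum(A, A.T)              # symmetrize

# two well-separated pairs of points
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
A = knn_graph(X, k=1)  # connects (0,1) and (2,3), nothing across
```

One would then loop over k, train a classifier on each resulting graph, and keep the best, which is exactly the repeated-training cost the post mentions.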
Fresh picks from ArXiv
This week highlights papers on scene graph generation, isomorphism testing by GNNs and WL, and weight estimation of pork cuts.
GNN
• How hard is graph isomorphism for graph neural networks? by Andreas Loukas
• Neural Stochastic Block Model & Scalable Community-Based Graph Learning
• Graph Density-Aware Losses for Novel Compositions in Scene Graph Generation
• Structured Query-Based Image Retrieval Using Scene Graphs
• Isometric Transformation Invariant and Equivariant Graph Convolutional Networks
• SpectralWeight: a spectral graph wavelet framework for weight prediction of pork cuts
Conferences
• Integrating Semantic and Structural Information with Graph Convolutional Network for Controversy Detection ACL 20
• GoGNN: Graph of Graphs Neural Network for Predicting Structured Entity Interactions IJCAI 20
• Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension ACL 20
Graph Theory
• Acyclic edge coloring conjecture is true on planar graphs without intersecting triangles
• The Weisfeiler-Leman Algorithm and Recognition of Graph Properties
Surveys
• Visual Relationship Detection using Scene Graphs: A Survey
• Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey
• Recent Advances in SQL Query Generation: A Survey
• Explainable Reinforcement Learning: A Survey
ACL 2020 stats
ACL (Association for Computational Linguistics) is the top conference in NLP. Each year there are more and more papers that use graphs with natural language.
Dates: July 5-10
Where: Online
• 3,088 submissions (2,906 in 2019)
• 571/208 long/short papers (25% overall acceptance rate; 447/213 in 2019)
• 42/6 long/short graph papers (7%/3% of total)
ACL 2020
The 58th Annual Meeting of the Association for Computational Linguistics
CVPR 2020
The Computer Vision and Pattern Recognition (CVPR) conference is the top conference in CV and has an increasing focus on applying graphs to images. Here are some facts about CVPR 2020.
Papers link
Dates: June 14-19
Where: Virtual
• 6,656 total papers
• 1,470 accepted papers
• 22% acceptance rate
• ~69 graph papers (~5% of total)
Graph Machine Learning research groups: Christos Faloutsos
I am doing a series of posts on research groups in graph ML. The sixth is Christos Faloutsos. He advised many current GML professors, such as Jure Leskovec, Leman Akoglu, Stephan Günnemann, and Bruno Ribeiro.
Christos Faloutsos (~1960)
- Affiliation: Carnegie Mellon University; Amazon
- Education: Ph.D. at University of Toronto in 1987 (supervised by Stavros Christodoulakis);
- h-index: 131;
- Awards: ACM fellow, best paper awards at KDD, SIGMOD, ICDM;
- Interests: data mining; databases; anomaly detection in graphs.
Social network data set
Anton @xgfsru shared the VK1M data set (password 1234) with the first 1M users of the social network vk.com (data collected via the public API). In addition to each user's friends, the file contains meta-information such as education, country, and birthday for each node. It can be useful for node classification or regression tasks, as well as community or anomaly detection.
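As a quick-start sketch for such a data set (the input format below is hypothetical; check the archive's actual layout): parse the friend pairs into an adjacency dict and, for a first structural look, extract connected components with a plain BFS.

```python
from collections import defaultdict, deque

# hypothetical format: one "user_id friend_id" pair per line
edge_lines = ["1 2", "2 3", "3 1", "4 5", "5 6", "6 4"]

adj = defaultdict(set)
for line in edge_lines:
    u, v = map(int, line.split())
    adj[u].add(v)
    adj[v].add(u)

def connected_components(adj):
    """BFS over an adjacency dict; yields each component as a sorted list."""
    seen = set()
    for start in adj:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp.append(u)
            for v in adj[u] - seen:
                seen.add(v)
                queue.append(v)
        yield sorted(comp)

print(list(connected_components(adj)))  # [[1, 2, 3], [4, 5, 6]]
```

The node metadata (education, country, birthday) would then become per-node features for classification or anomaly detection on this graph.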
Fresh picks from ArXiv
This week has more papers from the upcoming ACL, SIGIR, and KDD; a new survey on combinatorial optimization on graphs; and Borsuk's conjecture.
Conferences
• Joint Item Recommendation and Attribute Inference: An Adaptive Graph Convolutional Network Approach SIGIR 20
• ATBRG: Adaptive Target-Behavior Relational Graph Network for Effective Recommendation SIGIR 20
• Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks KDD 20
• Understanding Negative Sampling in Graph Representation Learning KDD 20
• Leveraging Graph to Improve Abstractive Multi-Document Summarization ACL 20
• M2GRL: A Multi-task Multi-view Graph Representation Learning Framework for Web-scale Recommender Systems KDD 20
• Graph Structure Learning for Robust Graph Neural Networks KDD 20
Surveys
• Learning Combinatorial Optimization on Graphs: A Survey with Applications to Networking
• How to Build a Graph-Based Deep Learning Architecture in Traffic Domain: A Survey
• Motif Discovery Algorithms in Static and Temporal Networks: A Survey
Graph Theory
• The Weisfeiler-Leman dimension of distance-hereditary graphs
• Counterexamples to Borsuk's conjecture from a third strongly regular graph
• A Group-Theoretic Framework for Knowledge Graph Embedding