Graph Machine Learning research groups: Karsten Borgwardt
I'm doing a series of posts on research groups in graph machine learning. The third is Karsten Borgwardt. His group at ETH Zurich works on computational biology, and his lab is well known for developing some of the most commonly used graph kernels (e.g. Weisfeiler-Leman, graphlet, and shortest-path kernels).
Karsten Borgwardt (1980)
- Affiliation: ETH Zurich;
- Education: Ph.D. at Ludwig-Maximilians-University in Munich in 2007 (supervised by Hans-Peter Kriegel);
- h-index: 47;
- Awards: Krupp Award, best-paper awards at NeurIPS;
- Interests: graph kernels; molecular graph representation; computational biology
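Since the Weisfeiler-Leman kernel comes up repeatedly in this series, here is a minimal sketch of the 1-WL colour refinement at its core. This is plain Python with illustrative names (not code from any library): each round, a vertex's colour is replaced by a compressed signature of its own colour and the multiset of its neighbours' colours, and the final colour histogram is the WL feature vector.

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-WL colour refinement: repeatedly replace each vertex's colour by a
    compressed id of (own colour, sorted multiset of neighbour colours)."""
    colors = {v: 0 for v in adj}          # start from a uniform colouring
    for _ in range(rounds):
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # compress signatures to small integer colour ids
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return Counter(colors.values())       # colour histogram = WL feature vector

# On a 3-path the endpoints end up in one colour class and the middle vertex
# in another; the WL kernel compares such histograms between graphs.
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_refine(path))  # two colour classes, of sizes 2 and 1
```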
Machine Learning & Computational Biology Lab
Homepage - Machine Learning and Computational Biology Lab
Online Math Seminars
There is a list of upcoming math seminars around the world that, due to the lockdown, will be held online and can be accessed by anyone. Topics are broad, including combinatorics, group theory, and graphs.
researchseminars.org
researchseminars.org - Browse talks
Welcome to researchseminars.org, a list of research seminars and conferences!
Fresh picks from ArXiv
This week features SIGMOD and CVPR accepted papers plus ICML submissions; graph embeddings for text classification; and surveys on the reconstruction conjecture and double descent.
SIGMOD
• Exact Single-Source SimRank Computation on Large Graphs
• An Algorithm for Context-Free Path Queries over Graph Databases
• A1: A Distributed In-Memory Graph Database — Bing's graph database
GNN
• Principal Neighbourhood Aggregation for Graph Nets with Petar Veličković; on injective aggregation over continuous features
• VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification — application to NLP
CVPR
• Semantic Image Manipulation Using Scene Graphs
ICML
• Learning to Recognize Spatial Configurations of Objects with Graph Neural Networks
• Graph Highway Networks
• The general theory of permutation equivariant neural networks and higher order graph variational encoders
Graph Theory
• On reconstruction of graphs from the multiset of subgraphs obtained by deleting ℓ vertices
Survey
• A Brief Prehistory of Double Descent
KDD Cup 2020 AutoGraph Challenge
There is a Kaggle-like challenge, running until 25 May, on AutoML for graph embeddings: 5 datasets, a $33.5K prize pool, a node classification task, accuracy as the metric, and time constraints on training and prediction.
Discovering automorphisms of a graph and its deck
My hypothesis was that if you take a graph that is hard for the graph isomorphism problem and remove one vertex, the resulting graph will be much easier, because the symmetry of the original graph is broken. So I took the hardest known graphs for graph isomorphism and measured how long it takes to determine the automorphism group (which is similar in difficulty to running an isomorphism test).
The results are quite interesting. Indeed, for many subgraphs determining the automorphism group is 50-100 times faster. But surprisingly, there are subgraphs that are harder than the original graph.
In the image below you can see results for 6 graphs from cfi-rigid-z2 with ~1000-2000 vertices: I measured the runtime for the original graph and for all subgraphs obtained by deleting one vertex. While for the first two graphs (first row) all subgraphs are easier, for the next 4 graphs there are smaller subgraphs that take 5x more time to solve than the original graph.
This could happen for three reasons: (1) the nauty solver has heuristics that happened to work better on the original graph than on the subgraphs; (2) the runs are unstable, and rerunning would give different runtimes; (3) the smaller graphs are indeed somehow harder than the original graph. I think (2) is very unlikely, and my guess is a combination of (1) and (3): removing a vertex makes vertices that were equivalent in the original graph non-equivalent in the subgraph, which reduces the amount of pruning nauty can do.
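The experiment itself relies on nauty and on graphs far too large to brute-force, but the underlying idea, comparing the symmetry of a graph with that of its vertex-deleted subgraphs (its deck), can be illustrated on a toy example. This sketch, with hypothetical helper names, counts automorphisms by checking all vertex permutations:

```python
from itertools import permutations

def automorphisms(n, edges):
    """Count automorphisms of a graph on vertices 0..n-1 by brute force:
    a permutation p is an automorphism iff it maps every edge to an edge."""
    eset = {frozenset(e) for e in edges}
    return sum(
        all(frozenset((p[u], p[v])) in eset for u, v in edges)
        for p in permutations(range(n))
    )

def deck(n, edges):
    """All vertex-deleted subgraphs, each relabelled to vertices 0..n-2."""
    cards = []
    for v in range(n):
        relabel = {u: i for i, u in enumerate(u for u in range(n) if u != v)}
        cards.append((n - 1, [(relabel[a], relabel[b])
                              for a, b in edges if v not in (a, b)]))
    return cards

# The 5-cycle has dihedral symmetry (|Aut| = 10); deleting any vertex leaves
# a 4-path, whose only symmetries are the identity and reversal (|Aut| = 2).
c5 = [(i, (i + 1) % 5) for i in range(5)]
print(automorphisms(5, c5))                            # 10
print([automorphisms(*card) for card in deck(5, c5)])  # [2, 2, 2, 2, 2]
```

Here every card is less symmetric than the original, which is exactly the intuition the experiment above puts to the test on hard instances.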
RWTH AACHEN UNIVERSITY
Benchmark Graphs for Practical Graph Isomorphism
The state-of-the-art solvers for the graph isomorphism problem (e.g. bliss , nauty/traces , conauto , saucy , etc.) can readily solve generic instances with tens of thousands of vertices. Indeed, experiments show that on inputs without particular combinatorial…
Some notes on visualization of the graphs
Stephen Wolfram, creator of the Wolfram Language, recently published the post Finally We May Have a Path to the Fundamental Theory of Physics… and It’s Beautiful, where he discusses possible origins of how the universe operates. I think the crux of his idea is this: if you model interactions between objects as a graph, and specify rules for how new interactions arise from existing ones, you get beautiful-looking graphs that resemble 3D shapes, which in the limit could represent our universe; you can then analyze properties of these graphs, such as diameter or curvature, to find equivalent notions in physics.
I won't speculate on whether the post is theoretically sound; let physicists debate that. In the end, any new theory should predict new facts, and for that we have to wait. But one thing is noticeable: the graphs Wolfram draws are beautiful. If you try to draw a big graph yourself, you will find it very hard to make it look like anything but a mess, yet here you get pretty-looking networks that genuinely resemble familiar 3D shapes. The Wolfram Language has many graph layouts, each producing a different image of the same graph; from the shapes in the post, it seems he used the SpectralEmbedding or SpringElectricalEmbedding layout. Daniel Spielman, a professor at Yale and two-time Gödel Prize winner, has a nice popular-science video where he discusses how such drawings relate to spectral graph theory and what conditions on the adjacency matrix yield a nice drawing. So maybe next time you will use one of these layouts to impress the reviewers of your paper.
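As a rough illustration of what a spectral layout does (a plain-Python sketch, not the Wolfram Language implementation): for the n-cycle, the low-frequency Laplacian eigenvectors are known in closed form, and using two of them as 2D coordinates places the vertices evenly on a circle, the "nice" drawing of that graph.

```python
import math

def cycle_laplacian_apply(vec):
    """Apply the graph Laplacian of the n-cycle: (L v)_i = 2 v_i - v_{i-1} - v_{i+1}."""
    n = len(vec)
    return [2 * vec[i] - vec[(i - 1) % n] - vec[(i + 1) % n] for i in range(n)]

n = 12
# For C_n the lowest nontrivial Laplacian eigenvectors are cos(2*pi*i/n) and
# sin(2*pi*i/n), both with eigenvalue 2 - 2*cos(2*pi/n).
x = [math.cos(2 * math.pi * i / n) for i in range(n)]
y = [math.sin(2 * math.pi * i / n) for i in range(n)]
lam = 2 - 2 * math.cos(2 * math.pi / n)

# Verify L x = lam * x up to floating error, then use (x_i, y_i) as the
# position of vertex i: the spectral layout draws the cycle as a circle.
lx = cycle_laplacian_apply(x)
assert all(abs(lx[i] - lam * x[i]) < 1e-9 for i in range(n))
layout = list(zip(x, y))
```

For general graphs the eigenvectors are computed numerically rather than written down, but the principle is the same: coordinates come from the eigenvectors of the smallest nonzero Laplacian eigenvalues.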
Discrete Differential Geometry Course
CS 15-458/858: Discrete Differential Geometry (Spring 2020) at Carnegie Mellon University. The lectures are available on YouTube. Discussions of the Laplace operator, smooth and discrete surfaces, curvatures, etc.
YouTube
Discrete Differential Geometry - CMU 15-458/858
Lecture videos for an intro course on Discrete Differential Geometry at Carnegie Mellon University (15-458).
April ArXiv: how many graph papers?
From 18 March to 17 April there were 300 new and 108 updated papers in the ArXiv CS section. This is around 50 papers fewer than in the previous period.
Web Conference 2020
This week the fully virtual Web Conference 2020 takes place. It lasts 5 days, and you can still register (~200 USD).
There are tracks on social networks, semantics (KG), and user modeling, which often deal with graphs.
About every third paper is on graphs.
There will be 4 tutorials and 2 workshops on graphs (Monday-Tuesday), which I described in this post.
Graph Tutorials at WebConf 2020
Entity Summarization in Knowledge Graphs: Algorithms, Evaluation, and Applications by Nanjing University, Samsung, Bosch
Learning Graph Neural Networks with Deep Graph Library by Amazon
Constructing Knowledge Graph for Social…
Fresh picks from ArXiv
Ever wondered what colour of sofa to choose to match the rest of your furniture? 🛋 Today you will find the answer in one of the papers, of course with the help of GNNs. Besides, there is a survey on 6G (!) technologies 📱, a new theoretical result on graph isomorphism 🎓, and many applications of graphs.
Applications
The Quantum Approximate Optimization Algorithm Needs to See the Whole Graph: A Typical Case — quantum computation
Recommendation system using a deep learning and graph analysis approach — recommendation
Learning Furniture Compatibility with Graph Neural Networks — interior design
Gumbel-softmax-based Optimization: A Simple General Framework for Optimization Problems on Graphs — combinatorial optimization
Knowledge graphs
DGL-KE: Training Knowledge Graph Embeddings at Scale
Dynamic Knowledge Graph-based Dialogue Generation with Improved Adversarial Meta-Learning
Layered Graph Embedding for Entity Recommendation using Wikipedia in the Yahoo! Knowledge Graph
Survey
A Survey of 6G Wireless Communications: Emerging Technologies
Duplication Detection in Knowledge Graphs: Literature and Tools
Graph theory
Isomorphism Testing for Graphs Excluding Small Minors — on graph isomorphism
Hitting forbidden induced subgraphs on bounded treewidth graphs — on treewidth
Low-stretch spanning trees of graphs with bounded width
Steiner Trees for Hereditary Graph Classes: a Treewidth Perspective
A forgotten story of Soviet AI
I found out about the Weisfeiler-Leman algorithm about 5 years ago, and some time later I realized that both authors were from the USSR. That was quite unexpected. I started looking up information about them and found a quite good biography of Boris Weisfeiler, written by his sister, but not much about Andrey Leman. For about a year I searched, one by one, for the people who knew him (they are now quite senior and don't use fancy messengers) to find out more about his life. Finally, I gathered enough to write a post about it: from his interest in math olympiads, to the development of the first AI chess player, to working in Silicon Valley.
His life is emblematic of the generation of mathematicians of his time: strong performance in math olympiads, competitive Moscow State University, work at the Institute of Theoretical and Experimental Physics, and then emigration to the West when the Iron Curtain collapsed. I like hearing these stories because they are reminiscent of the stories of my parents and their engineer friends. It's the voice of a time that is now inevitably gone. As with Babai's trip to the USSR, reading such stories uncovers the foundations of the graph theory, computer science, and artificial intelligence that we study today, and lets us connect the dots between old and new approaches.
Medium
A forgotten story of Soviet AI
What was it like to be a programmer 70 years ago?
Geometric and Relational Deep Learning
A workshop on GML will be streamed tomorrow on YouTube. It starts at 9:20 and continues until 17:00. The list of speakers is great:
Peter Battaglia, DeepMind
Natalia Neverova, Facebook AI Research (FAIR)
Stephan Günnemann, TU Munich
Yaron Lipman, Weizmann Institute of Science
Miltos Allamanis, Microsoft Research
Qi Liu, University of Oxford
Pim de Haan, University of Amsterdam & Qualcomm AI Research
Noemi Montobbio, Italian Institute of Technology
Geometric and Relational Deep Learning Pt. 2
Apparently this workshop will also have poster sessions, available only to registered participants. The list of papers is below (thanks to the people who attended).
• A geometric deep learning model to filter out anatomically non plausible fibers from tractograms [video]
• Patient-Specific Pathological Gait Modelling with Conditional-NRI [video]
• GRASP: Graph Alignment through Spectral Signatures
• Isomorphism Leakage in Multi-Interaction Datasets [video]
• Integrating Spectral and Spatial Domain Graph Neural Networks
• Unshackling Bisimulation with Graph Neural Networks
• State2vec: Learning Off-Policy State Representations [video]
• Are Graph Convolutional Networks Fully Exploiting Graph Structure? [video]
• Principal Neighbourhood Aggregation Networks [video]
• Attentive Group Equivariant Convolutional Networks [video]
• SMP: An Equivariant Message Passing Scheme for Learning Graph Structural Information [video]
• Evaluation of Molecular Fingerprints for Similarity-based Virtual Screening generated through Graph Convolution Networks [video]
• Network alignment with GNN [video]
• Learning Generative Models across Incomparable Spaces
• Learning Set Operations for Deformable Shapes [video]
• Instant recovery of shape from spectrum via latent space connections [video]
• SIGN: Scalable Inception Graph Neural Networks [video]
• Universal Invariant and Equivariant Graph Neural Networks [video]
• Graph Convolutional Gaussian Processes for Link Prediction [video]
• Deep Graph Mapper: Seeing Graphs through the Neural Lens [video]
• Geoopt: Riemannian Optimization in PyTorch [video]
• HyperLearn: A Distributed Approach for Representation Learning in Datasets With Many Modalities
• Multi-relational Poincaré Graph Embeddings [video]
• On Understanding Knowledge Graph Representation [video]
• Learning Object-Object Relations in Video [video]
YouTube
1: A geometric deep learning model to filter out anatomically non plausible fibers from tractograms
spotlight presentation @ ELLIS Workshop on Geometric Deep Learning
Graph Machine Learning research groups: Philip S. Yu
I'm doing a series of posts on research groups in graph machine learning. The fourth is Philip S. Yu. Fun fact: in 2019 he published 136 papers according to Google Scholar (approximately one paper every 2.5 days).
Philip S. Yu (~1952)
- Affiliation: University of Illinois at Chicago;
- Education: Ph.D. at Stanford in 1978; MBA at NYU in 1982;
- h-index: 161;
- Awards: multiple awards from KDD, IEEE CS, and ICDM;
- Interests: data mining, anomaly detection on graphs, graph surveys, GNN.
ICLR 2020
The ICLR conference starts this Sunday with workshops, followed by 4 days of the main conference. If you registered, you can enter the ICLR portal, where the whole conference takes place (videos, chats, keynotes, etc.). The papers are projected with t-SNE so that you can quickly find relevant ones.
Finally, if you are interested in graph machine learning, revisit the post I wrote earlier this year, Top Trends of Graph Machine Learning in 2020, based on 150 graph papers submitted or accepted to ICLR.
ICLR 2020, Day 1
Here is a list of interesting papers from Day 1 (with their links to the ICLR portal).
1. On Universal Equivariant Set Networks portal link
2. Hoppity: Learning Graph Transformations to Detect and Fix Bugs in Programs portal link
3. GraphSAINT: Graph Sampling Based Inductive Learning Method portal link
4. Measuring and Improving the Use of Graph Information in Graph Neural Networks portal link
5. Deep Double Descent: Where Bigger Models and More Data Hurt portal link
6. GraphAF: a Flow-based Autoregressive Model for Molecular Graph Generation portal link
7. Dynamically Pruned Message Passing Networks for Large-scale Knowledge Graph Reasoning portal link
8. You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings portal link
9. Deep Graph Matching Consensus portal link
10. PairNorm: Tackling Oversmoothing in GNNs portal link
11. Making Efficient Use of Demonstrations to Solve Hard Exploration Problems portal link
12. LambdaNet: Probabilistic Type Inference using Graph Neural Networks portal link
13. StructPool: Structured Graph Pooling via Conditional Random Fields portal link
14. Implementation Matters in Deep RL: A Case Study on PPO and TRPO portal link
iclr.cc
ICLR Website
Knowledge graphs at ICLR 2020
Interested in knowledge graphs? Check out this fresh post on knowledge graphs at ICLR from Michael Galkin.
Medium
Knowledge Graphs @ ICLR 2020
👋 Hello, I hope you are all doing well during the lockdown. ICLR 2020 went fully virtual, and here is a fully virtual article (well…
ICLR 2020, Day 2
Day 1 was great: each paper has a prerecorded 5-minute video and two Zoom slots where you can ask questions. Very convenient.
Here is a list of interesting papers from Day 2.
1. Abstract Diagrammatic Reasoning with Multiplex Graph Networks portal link
2. Probability Calibration for Knowledge Graph Embedding Models portal link
3. Learning to Guide Random Search portal link
4. Directional Message Passing for Molecular Graphs portal link
5. Locally Constant Networks portal link
6. Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data portal link
7. On the Weaknesses of Reinforcement Learning for Neural Machine Translation portal link
8. Scale-Equivariant Steerable Networks portal link
9. DropEdge: Towards Deep Graph Convolutional Networks on Node Classification portal link
10. Learning Heuristics for Quantified Boolean Formulas through Reinforcement Learning portal link
11. Memory-Based Graph Networks portal link
12. Are Transformers universal approximators of sequence-to-sequence functions? portal link
13. GLAD: Learning Sparse Graph Recovery portal link
14. Hyper-SAGNN: a self-attention based graph neural network for hypergraphs portal link
15. The Curious Case of Neural Text Degeneration portal link
16. Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering portal link
17. Global Relational Models of Source Code portal link
Fresh picks from ArXiv
This week brings lots of applications of GNNs to recommender systems, a new way to scale GNNs, and a survey on deepfakes 🙃
Recommendation
• Contextualized Graph Attention Network for Recommendation with Item Knowledge Graph
• Learning Hierarchical Review Graph Representation for Recommendation
• Graph Learning Approaches to Recommender Systems: A Review with Philip S. Yu
ACL
• GCAN: Graph-aware Co-Attention Networks for Explainable Fake News Detection on Social Media
• Relational Graph Attention Network for Aspect-based Sentiment Analysis
GNN
• SIGN: Scalable Inception Graph Neural Networks with Michael Bronstein
• Perturb More, Trap More: Understanding Behaviors of Graph Neural Networks
Graph Theory
• Coloring Problems on Bipartite Graphs of Small Diameter
• Graph isomorphism: Physical resources, optimization models, and algebraic characterizations
• On the spectral gap and the diameter of Cayley graphs
• Counterexamples to a conjecture by Gross, Mansour and Tucker on partial-dual genus polynomials of ribbon graphs
Survey
• The Creation and Detection of Deepfakes: A Survey