Graph Machine Learning research groups: Jian Tang
I do a series of posts on the groups in graph research; the previous post is here. The tenth is Jian Tang, a young professor at Mila with 4 papers at ICML and 2 at ICLR in 2020, who is looking for graduate students (in case you want to pursue a PhD).
Jian Tang (~1987)
- Affiliation: Mila-Quebec AI Institute and HEC Montreal
- Education: Ph.D. at Peking University, China in 2014 (supervised by Ming Zhang);
- h-index: 19;
- Awards: Amazon faculty research award; best paper at a KDD workshop;
- Interests: graph molecular design; graph generative models; knowledge graphs
How Gödel’s Proof Works
Down-to-earth explanation by QuantaMagazine of how Gödel proved his famous incompleteness theorems. To many, these results are a profound illustration that mathematics can never capture all truths about the world: there are true statements that we will never prove. One example of such an "unprovable" problem is the Continuum Hypothesis, the claim that there is no infinite set whose size lies strictly between the set of integers (countable) and the set of reals (uncountable); it can be neither proved nor disproved from the standard axioms of set theory.
In some sense, Gödel's results killed the foundational program of mathematics and, luckily, paved the way for the emergence of computer science. There is a very good comic (I'm not a fan of comics, but this one is about mathematics and short) called Logicomix that explains well the events at the start of the 20th century in mathematics, highlighting how many great thinkers such as Russell, Cantor, Boole, Wittgenstein, Hilbert, Frege, Poincaré, Gödel, and Turing approached the creation of new foundations for math and eventually failed.
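For reference, the Continuum Hypothesis in symbols (a standard formulation, not from the article):

```latex
% Continuum Hypothesis: no cardinality strictly between
% the integers and the reals
\neg \exists S \;:\; \aleph_0 < |S| < 2^{\aleph_0}
% equivalently, 2^{\aleph_0} = \aleph_1 (given the axiom of choice)
```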
KDD 2020 stats
KDD is one of the biggest and most prominent conferences in GML.
Dates: Aug 23-27
Where: Online
Cost: $250
• 1279 research track / 756 applied data science track submissions (vs 1200 research track submissions in 2019)
• 217 / 121 accepted (vs 170 in 2019)
• 16.9% / 16.0% acceptance rate (vs 14.2% in 2019)
• 70 / 14 graph papers (32% / 11% of total)
Fresh picks from ArXiv
This week presents accepted papers at ECCV 2020 and other conferences, new ways to overcome oversmoothing in deep GNNs, and a survey on the Internet of Things 📱
GNN
• Towards Deeper Graph Neural Networks
• Natural Graph Networks with Max Welling
• Investigating Pretrained Language Models for Graph-to-Text Generation with Iryna Gurevych
• Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning
Conferences
• Sketching Image Gist: Human-Mimetic Hierarchical Scene Graph Generation ECCV 2020
• Learn to Propagate Reliably on Noisy Affinity Graphs ECCV 2020
• SumGraph: Video Summarization via Recursive Graph Modeling ECCV 2020
• Graph-Based Social Relation Reasoning ECCV 2020
• Video Object Segmentation with Episodic Graph Memory Networks ECCV 2020
• GMNet: Graph Matching Network for Large Scale Part Semantic Segmentation in the Wild ECCV 2020
• SummPip: Unsupervised Multi-Document Summarization with Sentence Graph Compression SIGIR 2020
• A Graph-based Interactive Reasoning for Human-Object Interaction Detection IJCAI 2020
Survey
• Recommender Systems for the Internet of Things: A Survey
Do we need deep graph neural networks?
A very frequent research question is discussed in a new blog post by Michael Bronstein. The problem with deep GNNs is called over-smoothing: with more layers, a GNN tends to produce embeddings that become nearly identical across all nodes, washing out the signal that distinguishes them.
This problem was brought into focus by an AAAI'20 paper and has since received a lot of attention (I'd say it is the second most popular theoretical question about GNNs after expressiveness), with many methods proposed to tackle over-smoothing. It seems that if the graph/node labels depend on high-order information such as graphlets, then depth is necessary; however, such datasets may not be common in real life. A toy illustration of the effect is sketched below.
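To see the effect concretely, here is a toy NumPy sketch (my own illustration, not from the blog post): repeatedly applying a random-walk-normalized propagation matrix, the linear core of many GNN layers, drives node features toward identical values on a connected graph.

```python
import numpy as np

# Small connected graph: a ring of n nodes (connectivity guaranteed).
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

A_hat = A + np.eye(n)                            # add self-loops
P = A_hat / A_hat.sum(axis=1, keepdims=True)     # random-walk normalized propagation

rng = np.random.default_rng(0)
X = rng.random((n, 3))                           # random 3-dimensional node features

for k in [1, 2, 8, 32]:
    Xk = np.linalg.matrix_power(P, k) @ X        # k propagation steps ~ k GNN layers
    spread = np.abs(Xk - Xk.mean(axis=0)).max()  # how distinguishable nodes remain
    print(f"{k:>2} layers: max deviation across nodes = {spread:.4f}")
```

The deviation shrinks toward zero as the number of layers grows, which is exactly the over-smoothing phenomenon; learned weight matrices and nonlinearities in a real GNN can slow this down but do not automatically prevent it.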
ECCV 2020 stats
ECCV is among the best conferences in computer vision.
Dates: Aug 23-28
Where: Online
Cost: £150
Link to papers
• 5025 submissions (vs 2439 in 2018)
• 1361 accepted (vs 776 in 2018)
• 27.1% acceptance rate (vs 31.8% in 2018)
• 4 graph papers
IJCAI 2020 stats
IJCAI moved its dates to Jan 2021.
Dates: Jan 2021
Where: Japan/Online
Link to papers
• 4717 submissions (vs 4752 in 2019)
• 592 accepted (vs 850 in 2019)
• 12.6% acceptance rate (vs 17.9% in 2019)
• 55 graph papers
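As a side note, the acceptance rates in these stats posts are simply accepted over submitted; a throwaway Python helper (just for illustration) reproduces the quoted numbers:

```python
# Reproduce the acceptance rates quoted in the stats posts.
def acceptance_rate(accepted: int, submitted: int) -> float:
    return 100.0 * accepted / submitted

print(f"{acceptance_rate(592, 4717):.1f}%")   # IJCAI'20 -> 12.6%
print(f"{acceptance_rate(1361, 5025):.1f}%")  # ECCV'20  -> 27.1%
```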
Opening slides from GRL+ workshop (ICML 20) by Petar Veličković.
Trends in GML
I think the GRL+ workshop is really cool: it gathers people in GML and discusses ideas that are not yet fully developed but soon will be. It's like peeking into a crystal ball. Petar Veličković, one of the organizers of this workshop, outlined the following trends:
- Emerging work on performance / scalability (e.g. SIGN, Weisfeiler & Leman go sparse)
- KG embeddings are as strong as ever (e.g. neural multi-hop reasoning, MPQE, Stay Positive, UniKER)
- Proposals of many datasets/benchmarks/libraries (Wiki-CS, TUDataset, Spektral, Graphein, Geo2DR, Geoopt)
- Work on computational chemistry (with applications to drug design/repurposing), such as the Retrosynthesis paper (which won the best paper award)
- Applications of GRL to algorithmic reasoning (e.g. Neural Bipartite Matching, planning with neuro-algorithmic policies, and PGNs)
But the obvious standout, not only in the papers but also in most of our invited talks, is the explicit consideration of structure.
Fresh picks from ArXiv
This week highlights a new knowledge graph about COVID-19, applications to program similarity and drug discovery, as well as a batch of papers accepted to ECCV 20.
GNN
• COVID-19 Knowledge Graph: Accelerating Information Retrieval and Discovery for Scientific Literature
• funcGNN: A Graph Neural Network Approach to Program Similarity
• The expressive power of kth-order invariant graph networks
• Fast Graphlet Transform of Sparse Graphs
• Visualizing Deep Graph Generative Models for Drug Discovery
• Second-Order Pooling for Graph Neural Networks
• Graph-PCNN: Two Stage Human Pose Estimation with Graph Pose Refinement
• Hierarchical Protein Function Prediction with Tail-GNNs with Petar Veličković
Conferences
• Multi-view adaptive graph convolutions for graph classification ECCV 20
• Comprehensive Image Captioning via Scene Graph Decomposition ECCV 20
• Differentiable Hierarchical Graph Grouping for Multi-Person Pose Estimation ECCV 20
• Grale: Designing Networks for Graph Learning with Bryan Perozzi, KDD 20
• Edge-aware Graph Representation Learning and Reasoning for Face Parsing ECCV 20
Surveys
• A Survey on Complex Question Answering over Knowledge Base: Recent Advances and Challenges
• A Survey on Graph Neural Networks for Knowledge Graph Completion
Discovering Symbolic Models in Physical Systems Using Deep Learning
Today (July 29, at 12:00 EDT) there will be a Zoom lecture on applying GNNs to cosmology by Shirley Ho at the Physics ∩ ML seminar.
Abstract: We develop a general approach to distill symbolic representations of a learned deep model by introducing strong inductive biases. We focus on Graph Neural Networks (GNNs). The technique works as follows: we first encourage sparse latent representations when we train a GNN in a supervised setting, then we apply symbolic regression to components of the learned model to extract explicit physical relations. We find the correct known equations, including force laws and Hamiltonians, can be extracted from the neural network. We then apply our method to a non-trivial cosmology example—a detailed dark matter simulation—and discover a new analytic formula that can predict the concentration of dark matter from the mass distribution of nearby cosmic structures. The symbolic expressions extracted from the GNN using our technique also generalized to out-of-distribution-data better than the GNN itself. Our approach offers alternative directions for interpreting neural networks and discovering novel physical principles from the representations they learn.
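The recipe in the abstract has two stages: train a GNN whose messages are encouraged to be sparse, then run symbolic regression on the learned message components. Below is a hedged PyTorch sketch of the first stage only; the module and names are hypothetical, not the authors' code.

```python
import torch
import torch.nn as nn

class EdgeModel(nn.Module):
    """Hypothetical message function phi(x_i, x_j) -> message vector."""
    def __init__(self, node_dim: int = 4, msg_dim: int = 16):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(2 * node_dim, 64), nn.ReLU(), nn.Linear(64, msg_dim))

    def forward(self, x_i: torch.Tensor, x_j: torch.Tensor) -> torch.Tensor:
        return self.phi(torch.cat([x_i, x_j], dim=-1))

def loss_fn(pred, target, messages, l1_weight=1e-2):
    # Supervised loss plus an L1 penalty on the latent messages, mirroring
    # the "sparse latent representations" the abstract describes; sparsity
    # leaves only a few message components for symbolic regression to fit.
    return nn.functional.mse_loss(pred, target) + l1_weight * messages.abs().mean()
```

Stage two (not shown) fits analytic expressions, e.g. with a symbolic regression package, to the few message components that survive the L1 penalty.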
Podcast with Michael Bronstein
There is a podcast called This Week in Machine Learning & AI (TWIML) about various aspects of AI. Michael Bronstein, head of graph machine learning at Twitter, recently gave a lengthy interview about the evolution of the field over the last two years. He describes current challenges (e.g. scalability), differences between industrial and academic settings for graphs, and his recent works, as well as predictions of where the area of GML is heading.
Main theme from GRL+ workshop
I already mentioned it, but let me add more things about trends in GML (credits to Petar Veličković).
The biggest theme of the GRL+ workshop was the explicit consideration of structure, which has so far been largely ignored in GNNs (i.e. one would just assume a given graph without thinking about how it got there or whether it could be specialized for the task at hand).
In the accepted papers, we have many works which tackle latent structure inference (e.g. Differentiable Graph Module, set2graph, Relate-and-Predict, GFSA, and our PGN are all examples thereof) and also works which attempt to explicitly exploit structure in the data for prediction (e.g. the recent subgraph isomorphism counting paper).
This direction was echoed a lot in our invited talks as well.
Thomas Kipf talked about relational structure discovery (NRI, CompILE, and his recent slot attention work).
Kyle Cranmer talked about how critical structure discovery and inductive biases are in physics-based applications, highlighting especially his set2graph work as well as their recent work on discovering symbolic representations.
Danai Koutra talked about how graphs can be appropriately summarized and how to design GNN layers that deal with heterophily.
Tina Eliassi-Rad gave an amazing lecture-style talk on how topology and structure can be leveraged in machine learning more generally. During our Q&A session, she was asked to comment on the explosive usage of datasets like Cora (as she is one of the authors of the paper that originally proposed Cora, Citeseer, etc.). She made a very important 'wake-up call' to GRL folks: we shouldn't think our graphs fall from the sky, and when applying heavy-duty GNN methods, hyperbolic embeddings, etc. in the real world, we should always ask the question: 'do we really expect our graphs to be coming from a distribution like this?'
Videos of all of this should be available in the coming weeks.
Graph Machine Learning research groups: Kristian Kersting
I do a series of posts on the groups in graph research; the previous post is here. The 11th is Kristian Kersting, co-author of the TU datasets and several graph kernels.
Kristian Kersting (1973)
- Affiliation: TU Darmstadt
- Education: Ph.D. at the University of Freiburg, Germany in 2006 (supervised by Luc De Raedt);
- h-index: 49;
- Awards: best paper at ECML, AAAI; Inaugural German AI Award;
- Interests: graph kernels, graph data sets
Controlling Fake News using Graphs and Statistics
This is a guest post by Siddharth Bhatia about their recent work with Christos Faloutsos on anomaly detection in streaming data.
MIDAS, the Microcluster-Based Detector of Anomalies in Edge Streams (AAAI 2020), uses unsupervised learning to detect anomalies in streaming data in real time. It was designed with the patterns of recent sophisticated attacks in mind. MIDAS can be used to detect intrusions, Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks, financial fraud, and fake ratings.
MIDAS combines a chi-squared goodness-of-fit test with Count-Min Sketch (CMS) streaming data structures to assign an anomaly score to each edge. It then incorporates temporal and spatial relations to achieve better performance. MIDAS provides theoretical guarantees on false positives and is three orders of magnitude faster than existing state-of-the-art solutions. A minimal sketch of the scoring idea follows the links below.
Paper: https://arxiv.org/abs/1911.04464
Code: https://github.com/Stream-AD/MIDAS
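Below is a minimal, illustrative sketch of the chi-squared scoring idea as I read the paper; plain dictionaries stand in for the Count-Min Sketch counters of the real implementation, and the class and variable names are hypothetical.

```python
from collections import defaultdict

class MidasScorer:
    """Toy per-edge anomaly scorer in the spirit of MIDAS (not the real code)."""
    def __init__(self):
        self.cur = defaultdict(int)    # edge count within the current time tick
        self.total = defaultdict(int)  # edge count over all ticks so far
        self.tick = 0

    def score(self, u, v, t):
        if t > self.tick:              # new tick: reset current counts
            self.cur.clear()
            self.tick = t
        self.cur[(u, v)] += 1
        self.total[(u, v)] += 1
        a, s = self.cur[(u, v)], self.total[(u, v)]
        if t <= 1:
            return 0.0                 # no history to compare against yet
        # Chi-squared statistic: how far this tick's count deviates from the
        # historical mean count per tick; bursts (microclusters) score high.
        mean = s / t
        return (a - mean) ** 2 * t * t / (s * (t - 1))

scorer = MidasScorer()
print(scorer.score("a", "b", t=1))     # 0.0: first sighting, no history
for _ in range(10):                    # sudden burst of the same edge at t=2
    burst = scorer.score("a", "b", t=2)
print(burst)                           # large value flags the burst
```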
Graph Machine Learning Newsletter
I finally had time to compose the first issue of my newsletter on graph machine learning. It will be out soon!
Please subscribe and share it with your friends: https://newsletter.ivanovml.com/ (or, in case it gives a warning, here is a backup link: https://www.getrevue.co/profile/graphML).
My hope is that it will be similar to Ruder's newsletter on NLP, highlighting recent developments, current trends, and upcoming events in GML. I plan to send 1-2 issues per month, so it will be less frequent than this channel but a longer read about our field.
In case you have seen recent blog posts, interviews, conference highlights, industry updates, or anything else worth sharing with the community, don't hesitate to write to me.
Fresh picks from ArXiv
This week features architecture search and low-latency inference in GNNs, as well as a review of graph signal processing 📶
GNN
• Neural Architecture Search in Graph Neural Networks
• FC-GAGA: Fully Connected Gated Graph Architecture for Spatio-Temporal Traffic Forecasting
• Pooling Regularized Graph Neural Network for fMRI Biomarker Analysis
• GRIP: A Graph Neural Network Accelerator Architecture with Christopher Ré
• PyKEEN 1.0: A Python Library for Training and Evaluating Knowledge Graph Embeddings
Math
• A polynomial-time algorithm to determine (almost) Hamiltonicity of dense regular graphs
Surveys
• Graph signal processing for machine learning: A review and new perspectives with Michael Bronstein
Videos for ICML 2020 workshops and tutorials
Available at SlidesLive.
Two related to GML are:
• GRL+ workshop
• Bridge Between Perception and Reasoning: Graph Neural Networks & Beyond workshop