Top 8 trends from ICLR 2019
Overview of trends on #ICLR2019:
1. Inclusivity
2. Unsupervised representation learning & transfer learning
3. Retro ML
4. RNN is losing its luster with researchers
5. GANs are still going strong
6. The lack of biologically inspired deep learning
7. Reinforcement learning is still the most popular topic by submissions
8. Most accepted papers will be quickly forgotten
Link: https://huyenchip.com/2019/05/12/top-8-trends-from-iclr-2019.html
Paper on RoBERTa, which currently sits at the top of the GLUE leaderboard (https://gluebenchmark.com/leaderboard/)
https://arxiv.org/abs/1907.11692
The General Language Understanding Evaluation (GLUE) benchmark is a collection of resources for training, evaluating, and analyzing natural language understanding systems
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Liu et al.: https://arxiv.org/abs/1907.11692
#bert #naturallanguageprocessing #unsupervisedlearning
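A minimal sketch of trying the pretrained model, assuming the Hugging Face transformers package (not mentioned in the post): load the RoBERTa-base encoder and produce contextual token embeddings for a sentence.
```python
import torch
from transformers import RobertaModel, RobertaTokenizer

# Load the pretrained RoBERTa-base tokenizer and encoder.
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModel.from_pretrained("roberta-base")
model.eval()

inputs = tokenizer("RoBERTa revisits BERT's pretraining recipe.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per input token: (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```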
Now AI can also be used to identify fake text.
Researchers have released a tool called GLTR (Giant Language Model Test Room).
1) Try it yourself on the GLTR tool page: https://gltr.io/
2) GitHub link: https://github.com/HendrikStrobelt/detecting-fake-text
3) Paper link: https://arxiv.org/pdf/1906.04043.pdf
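GLTR's core idea is to colour each token by how highly a language model such as GPT-2 ranks it among its predictions: a passage where nearly every token falls in the model's top few guesses is suspiciously easy to predict and therefore likely machine-generated. A rough sketch of that per-token ranking, assuming the Hugging Face transformers library rather than the GLTR codebase itself:
```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "The quick brown fox jumps over the lazy dog."
ids = tokenizer.encode(text, return_tensors="pt")

with torch.no_grad():
    logits = model(ids).logits  # (1, seq_len, vocab_size)

# For every position, how highly did the model rank the token that actually came next?
ranks = []
for t in range(ids.shape[1] - 1):
    probs = torch.softmax(logits[0, t], dim=-1)
    next_token = ids[0, t + 1]
    rank = int((probs > probs[next_token]).sum()) + 1
    ranks.append((tokenizer.decode([next_token.item()]), rank))

# Uniformly tiny ranks across a long passage hint that the text was sampled from a model.
print(ranks)
```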
Deep Learning with MATLAB: Transfer Learning in 10 Lines of MATLAB Code
https://nl.mathworks.com/videos/deep-learning-with-matlab-transfer-learning-in-10-lines-of-matlab-code-1487714838381.html
"Learn how to use transfer learning in MATLAB to re-train deep learning networks created by experts for your own data or task. "
Machine Learning Top 10 Articles for the Past Month (v.July 2019)
https://medium.com/@Mybridge/machine-learning-top-10-articles-for-the-past-month-v-july-2019-178436f99201
Low-Rank Matrix Completion: A Contemporary Survey. arxiv.org/abs/1907.11705
DEAM: Accumulated Momentum with Discriminative Weight for Stochastic Optimization. arxiv.org/abs/1907.11307
Recurrent Aggregation Learning for Multi-View Echocardiographic Sequences Segmentation. arxiv.org/abs/1907.11292
Yann LeCun:
"EGG is a new toolkit that allows researchers and developers to quickly create game simulations in which two neural network agents devise their own discrete communication system in order to solve a task together."
https://code.fb.com/ai-research/egg-toolkit/
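This is not EGG's own API; it is a plain PyTorch sketch of the kind of signalling game the toolkit builds: a sender network must convey a target object to a receiver through a discrete symbol, with Gumbel-softmax keeping the discrete channel trainable end to end.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

N_OBJECTS, VOCAB = 10, 10  # toy setting: 10 objects, 10 possible symbols

class Sender(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(N_OBJECTS, VOCAB)
    def forward(self, obj):
        # Emit a (hard) one-hot symbol; Gumbel-softmax keeps it differentiable.
        return F.gumbel_softmax(self.fc(obj), tau=1.0, hard=True)

class Receiver(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(VOCAB, N_OBJECTS)
    def forward(self, msg):
        return self.fc(msg)  # guess which object the sender saw

sender, receiver = Sender(), Receiver()
opt = torch.optim.Adam([*sender.parameters(), *receiver.parameters()], lr=1e-2)

for step in range(2000):
    targets = torch.randint(0, N_OBJECTS, (32,))
    inputs = F.one_hot(targets, N_OBJECTS).float()
    loss = F.cross_entropy(receiver(sender(inputs)), targets)
    opt.zero_grad(); loss.backward(); opt.step()

# After training, check whether a shared code emerged.
guesses = receiver(sender(torch.eye(N_OBJECTS))).argmax(-1)
print((guesses == torch.arange(N_OBJECTS)).float().mean())
```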
The ACM Turing Lecture on "The #DeepLearning Revolution" is now available
Each lecture is presented by the winner, on a topic of their choice, at a forum of their choice, in the year they received the ACM A.M. Turing Award: https://amturing.acm.org/lectures.cfm
#ArtificialIntelligence #SelfSupervisedLearning
Official TensorFlow implementation of U-GAT-IT
Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
GitHub, by Junho Kim: https://github.com/taki0112/UGATIT
#tensorflow #unsupervisedlearning #generativemodels
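The adaptive layer-instance normalization in the title mixes instance-norm and layer-norm statistics with a learnable ratio, with the scale and shift (gamma, beta) supplied by the generator rather than learned per layer. A rough PyTorch sketch of that normalization idea (the linked repo is TensorFlow; the names here are illustrative):
```python
import torch
import torch.nn as nn

class AdaLIN(nn.Module):
    """Sketch of Adaptive Layer-Instance Normalization: blend instance-norm and
    layer-norm statistics with a learnable ratio rho in [0, 1], then apply a
    gamma/beta that the rest of the network predicts for each input."""
    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.rho = nn.Parameter(torch.full((1, num_features, 1, 1), 0.9))

    def forward(self, x, gamma, beta):
        # Instance norm: statistics per sample, per channel.
        in_mean = x.mean(dim=(2, 3), keepdim=True)
        in_var = x.var(dim=(2, 3), keepdim=True, unbiased=False)
        x_in = (x - in_mean) / torch.sqrt(in_var + self.eps)
        # Layer norm: statistics per sample, across channels and space.
        ln_mean = x.mean(dim=(1, 2, 3), keepdim=True)
        ln_var = x.var(dim=(1, 2, 3), keepdim=True, unbiased=False)
        x_ln = (x - ln_mean) / torch.sqrt(ln_var + self.eps)
        # Learnable interpolation between the two normalizations.
        rho = self.rho.clamp(0.0, 1.0)
        out = rho * x_in + (1.0 - rho) * x_ln
        # gamma, beta have shape (batch, channels) and come from elsewhere in the net.
        return out * gamma.view(gamma.size(0), -1, 1, 1) + beta.view(beta.size(0), -1, 1, 1)
```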
ConvNet Playground
An interactive visualization tool for exploring Convolutional #NeuralNetworks applied to the task of semantic image search.
Created by Victor Dibia at Fast Forward Labs: https://convnetplayground.fastforwardlabs.com/#/
#datavisualization #deeplearning #machinelearning
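The basic recipe the tool visualizes can be sketched in a few lines: embed every image with a pretrained CNN, then rank the corpus by cosine similarity to a query image. The file names below are hypothetical placeholders.
```python
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

# Use a pretrained CNN as a feature extractor for semantic image search.
backbone = models.resnet50(pretrained=True)
backbone.fc = torch.nn.Identity()   # keep the 2048-d pooled features
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def embed(path):
    with torch.no_grad():
        return backbone(preprocess(Image.open(path).convert("RGB")).unsqueeze(0))

# Hypothetical image files, for illustration only.
corpus = ["cat1.jpg", "dog1.jpg", "car1.jpg"]
corpus_vecs = torch.cat([embed(p) for p in corpus])
query_vec = embed("query.jpg")

# Rank corpus images by similarity to the query.
scores = F.cosine_similarity(query_vec, corpus_vecs)
print([corpus[i] for i in scores.argsort(descending=True)])
```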
Implicit Generation and Generalization in Energy-Based Models
Yilun Du and Igor Mordatch: https://arxiv.org/abs/1903.08689
#EnergyBasedModels #MachineLearning #GenerativeModels
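The paper trains and samples energy-based models with gradient-based MCMC (Langevin dynamics). A toy sketch of that sampler on a hand-written quadratic energy; in the paper, E is a neural network:
```python
import torch

# Toy energy function; low energy = high probability.
def E(x):
    return 0.5 * (x ** 2).sum(dim=-1)

def langevin_sample(x, steps=100, step_size=0.01):
    """Langevin dynamics: gradient descent on the energy plus Gaussian noise."""
    for _ in range(steps):
        x = x.detach().requires_grad_(True)
        energy = E(x).sum()
        grad, = torch.autograd.grad(energy, x)
        x = x - 0.5 * step_size * grad + torch.randn_like(x) * step_size ** 0.5
    return x.detach()

# Start from noise and refine it toward low-energy regions.
samples = langevin_sample(torch.randn(16, 2))
print(samples.mean(0), samples.std(0))
```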
OpenAI’s GPT-2: A Simple Guide to Build the World’s Most Advanced Text Generator in Python
https://www.analyticsvidhya.com/blog/2019/07/openai-gpt2-text-generator-python/
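The article walks through its own setup; as a minimal alternative sketch, here is sampling from the public GPT-2 weights with the Hugging Face transformers library (an assumption on my part; the guide may use different tooling):
```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Artificial intelligence will"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        input_ids,
        max_length=60,
        do_sample=True,       # sample instead of greedy decoding
        top_k=50,             # restrict to the 50 most likely next tokens
        top_p=0.95,           # nucleus sampling
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```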
Quantum Computers, Neural Computers, and the future of 0s and 1s.
https://medium.com/@normandipalo/quantum-computers-neural-computers-and-the-future-of-0s-and-1s-ac790ca1f5e4