A Gentle Introduction to Monte Carlo Sampling for Probability
https://machinelearningmastery.com/monte-carlo-sampling-for-probability/
MachineLearningMastery.com
A Gentle Introduction to Monte Carlo Sampling for Probability - MachineLearningMastery.com
Monte Carlo methods are a class of techniques for randomly sampling a probability distribution.
There are many problem domains where describing or estimating the probability distribution is relatively straightforward, but calculating a desired quantity…
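As a minimal sketch of the idea in the snippet above (not taken from the article itself): a probability can be estimated by drawing random samples from the distribution and counting how often the event of interest occurs.

```python
import random

def monte_carlo_probability(sample, event, n=100_000, seed=0):
    """Estimate P(event) by drawing n samples and counting hits."""
    rng = random.Random(seed)
    hits = sum(event(sample(rng)) for _ in range(n))
    return hits / n

# Estimate P(X > 1) for a standard normal X; the exact value is
# 1 - Phi(1) ≈ 0.1587, which the estimate approaches as n grows.
p = monte_carlo_probability(lambda rng: rng.gauss(0.0, 1.0),
                            lambda x: x > 1.0)
```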
A minimalist neural machine translation toolkit based on PyTorch that is specifically designed for novices.
https://arxiv.org/abs/1907.12484
https://github.com/joeynmt/joeynmt
arXiv.org
Joey NMT: A Minimalist NMT Toolkit for Novices
We present Joey NMT, a minimalist neural machine translation toolkit based on
PyTorch that is specifically designed for novices. Joey NMT provides many
popular NMT features in a small and simple...
The Visual Task Adaptation Benchmark
https://ai.googleblog.com/2019/11/the-visual-task-adaptation-benchmark.html
Abstraction and Reasoning Corpus
Paper: https://arxiv.org/abs/1911.01547
ARC: https://github.com/fchollet/ARC
arXiv.org
On the Measure of Intelligence
To make deliberate progress towards more intelligent and more human-like artificial systems, we need to be following an appropriate feedback signal: we need to be able to define and evaluate...
This project is adapted from the original Dive Into Deep Learning book
https://github.com/dsgiitr/d2l-pytorch
GitHub
GitHub - dsgiitr/d2l-pytorch: This project reproduces the book Dive Into Deep Learning (https://d2l.ai/), adapting the code from MXNet into PyTorch.
A Gentle Introduction to Maximum a Posteriori (MAP) for Machine Learning
https://machinelearningmastery.com/maximum-a-posteriori-estimation/
MachineLearningMastery.com
A Gentle Introduction to Maximum a Posteriori (MAP) for Machine Learning - MachineLearningMastery.com
Density estimation is the problem of estimating the probability distribution for a sample of observations from a problem domain.
Typically, estimating the entire distribution is intractable, and instead, we are happy to have the expected value of the distribution…
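A minimal sketch of MAP estimation (my own illustration, not from the article): for a coin with a Beta(a, b) prior on its bias, the posterior mode has a closed form, and the prior pulls the estimate away from the raw maximum likelihood.

```python
def map_bernoulli(heads, n, a=2.0, b=2.0):
    """MAP estimate of a coin's bias under a Beta(a, b) prior.

    The posterior is Beta(heads + a, n - heads + b); its mode is the
    closed form below. With a = b = 1 (flat prior) this reduces to
    the maximum likelihood estimate heads / n.
    """
    return (heads + a - 1.0) / (n + a + b - 2.0)

mle = 9 / 10                    # maximum likelihood: 0.9
map_est = map_bernoulli(9, 10)  # Beta(2, 2) prior pulls toward 0.5
```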
14 Different Types of Learning in Machine Learning
https://machinelearningmastery.com/types-of-learning-in-machine-learning/
MachineLearningMastery.com
14 Different Types of Learning in Machine Learning - MachineLearningMastery.com
Machine learning is a large field of study that overlaps with and inherits ideas from many related fields such as artificial intelligence. The focus of the field is learning, that is, acquiring skills or knowledge from experience. Most commonly, this means…
A Multimodal Language Dataset for Understanding Humor
article: https://arxiv.org/pdf/1904.06618.pdf
dataset: https://github.com/ROC-HCI/UR-FUNNY
GitHub
GitHub - ROC-HCI/UR-FUNNY: This repository presents the UR-FUNNY dataset, the first dataset for multimodal humor detection.
Green AI vs Red AI
https://arxiv.org/abs/1907.10597
Tackling Climate Change with Machine Learning
https://www.reddit.com/r/MachineLearning/comments/da30mv/r_tackling_climate_change_with_machine_learning/
arXiv.org
Green AI
The computations required for deep learning research have been doubling every few months, resulting in an estimated 300,000x increase from 2012 to 2018 [2]. These computations have a surprisingly...
How to Save a NumPy Array to File for Machine Learning
https://machinelearningmastery.com/how-to-save-a-numpy-array-to-file-for-machine-learning/
MachineLearningMastery.com
How to Save a NumPy Array to File for Machine Learning - MachineLearningMastery.com
Developing machine learning models in Python often requires the use of NumPy arrays.
NumPy arrays are efficient data structures for working with data in Python, and machine learning models like those in the scikit-learn library, and deep learning models…
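A quick sketch of the basic pattern the article covers: NumPy's native `.npy` format preserves dtype and shape exactly, whereas `np.savetxt` writes plain CSV text.

```python
import numpy as np

data = np.array([[1.0, 2.0], [3.0, 4.0]])

# Save in NumPy's binary format and load it back; the round trip
# preserves both the shape and the float64 dtype.
np.save("data.npy", data)
loaded = np.load("data.npy")
```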
Sparse Networks from Scratch: Faster Training without Losing Performance
https://arxiv.org/abs/1907.04840
https://timdettmers.com/2019/07/11/sparse-networks-from-scratch/
Sparse Learning Library and Sparse Momentum Resources
https://github.com/TimDettmers/sparse_learning
arXiv.org
Sparse Networks from Scratch: Faster Training without Losing Performance
We demonstrate the possibility of what we call sparse learning: accelerated training of deep neural networks that maintain sparse weights throughout training while achieving dense performance...
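The core idea in the abstract above, reduced to a toy NumPy sketch of my own (the paper's actual method, sparse momentum, also redistributes and regrows weights): keep a binary mask over the weights and apply gradient updates only where the mask is active, so the matrix stays sparse throughout training.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_init(shape, density=0.25):
    """Weight matrix with a fixed binary mask keeping `density` of entries."""
    weights = rng.standard_normal(shape)
    mask = rng.random(shape) < density
    return weights * mask, mask

def masked_update(weights, grad, mask, lr=0.01):
    # Gradients are applied only where the mask is active,
    # so pruned positions remain exactly zero after every step.
    return weights - lr * grad * mask

w, m = sparse_init((4, 4))
w = masked_update(w, np.ones((4, 4)), m)
```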
Introducing the Next Generation of On-Device Vision Models: MobileNetV3 and MobileNetEdgeTPU
https://ai.googleblog.com/2019/11/introducing-next-generation-on-device.html
Research Guide: Pruning Techniques for Neural Networks
https://heartbeat.fritz.ai/research-guide-pruning-techniques-for-neural-networks-d9b8440ab10d
Medium
Research Guide: Pruning Techniques for Neural Networks
[Nearly] Everything you need to know in 2019
Sberbank's subsidiary Cloud Technologies (which provides cloud services under the SberCloud brand) has unveiled Christofari, the most powerful Russian supercomputer.
It delivers 6.67 petaflops (about 6.7 quadrillion operations per second), which should place Christofari in the world's top 30. Access will be available to all AI Cloud subscribers; a minute of usage at full power costs 5,750 RUB (about $90).
Sharing our Experience Upgrading OpenNMT to TensorFlow 2.0
https://blog.tensorflow.org/2019/11/our-experience-upgrading-OpenNMT-to-TensorFlow.html
code: https://github.com/OpenNMT/OpenNMT-tf
OpenNMT: https://opennmt.net/
blog.tensorflow.org
Sharing our Experience Upgrading OpenNMT to TensorFlow 2.0
OpenNMT-tf is a neural machine translation toolkit for TensorFlow released in 2017. At that time, the project used many features and capabilities offered by TensorFlow: training and evaluation with tf.estimator, variable scopes, graph collections, tf.contrib…
How to Connect Model Input Data With Predictions for Machine Learning
https://machinelearningmastery.com/how-to-connect-model-input-data-with-predictions-for-machine-learning/
MachineLearningMastery.com
How to Connect Model Input Data With Predictions for Machine Learning - MachineLearningMastery.com
Fitting a model to a training dataset is so easy today with libraries like scikit-learn.
A model can be fit and evaluated on a dataset in just a few lines of code. It is so easy that it has become a problem.
The same few lines of code are repeated again…
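A minimal sketch of the problem the article addresses (my own toy example): after calling `predict`, each output needs to stay tied to the input row that produced it, which zipping rows with predictions makes explicit.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data with a known relationship y = 2x, so each prediction
# can be checked against its input row.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])

model = LinearRegression().fit(X, y)
yhat = model.predict(X)

# Pair every input row with its prediction so they cannot drift
# out of alignment in later processing.
paired = list(zip(X[:, 0].tolist(), yhat.tolist()))
```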
Stacked Capsule Autoencoders
https://github.com/google-research/google-research/tree/master/stacked_capsule_autoencoders
paper : https://arxiv.org/abs/1906.06818
https://akosiorek.github.io/ml/2019/06/23/stacked_capsule_autoencoders.html
What Does Stochastic Mean in Machine Learning?
https://machinelearningmastery.com/stochastic-in-machine-learning/
DeepFovea: Using deep learning for foveated reconstruction in AR/VR
https://ai.facebook.com/blog/deepfovea-using-deep-learning-for-foveated-reconstruction-in-ar-vr/
code: https://github.com/facebookresearch/DeepFovea
full paper: https://research.fb.com/publications/deepfovea-neural-reconstruction-for-foveated-rendering-and-video-compression-using-learned-statistics-of-natural-videos/
@ai_machinelearning_big_data
Facebook
DeepFovea: Using deep learning for foveated reconstruction in AR/VR
We are making available the DeepFovea network architecture, a new state of the art in foveated rendering for augmented and virtual reality using an AI-powered system.
RecSim: A Configurable Simulation Platform for Recommender Systems
https://ai.googleblog.com/2019/11/recsim-configurable-simulation-platform.html
article: https://arxiv.org/abs/1909.04847
github: https://github.com/google-research/recsim