Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine : https://arxiv.org/abs/1703.03400
#MachineLearning #ArtificialIntelligence #MetaLearning
We propose an algorithm for meta-learning that is model-agnostic, in the sense that it is compatible with any model trained with gradient descent and applicable to a variety of different learning...
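To make the mechanics concrete, here is a rough first-order NumPy sketch of the MAML idea on a hypothetical family of linear-regression tasks (the task distribution, step sizes and support/query split are illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    # Mean-squared error of a linear model X @ w, with its gradient.
    err = X @ w - y
    return (err ** 2).mean(), 2.0 * X.T @ err / len(y)

w = rng.normal(size=3)            # meta-parameters (theta)
alpha, beta = 0.01, 0.001         # inner and outer step sizes

for step in range(2000):
    meta_grad = np.zeros_like(w)
    for _ in range(4):            # a meta-batch of tasks
        # Sample a task: a random linear map plus noise.
        w_true = rng.normal(size=3)
        X = rng.normal(size=(10, 3))
        y = X @ w_true + 0.1 * rng.normal(size=10)
        # Inner loop: one gradient step from theta on the support set.
        _, g = loss_and_grad(w, X[:5], y[:5])
        w_task = w - alpha * g
        # Outer loss: adapted parameters evaluated on the query set.
        _, g_q = loss_and_grad(w_task, X[5:], y[5:])
        meta_grad += g_q          # first-order approximation
    w -= beta * meta_grad / 4     # meta-update
```

The full algorithm backpropagates through the inner update, so the meta-gradient contains second-order terms; the paper also reports a first-order approximation that performs nearly as well at lower cost.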
Official Tensorflow implementation of U-GAT-IT
Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation (ICLR 2020)
GitHub, by Junho Kim : https://github.com/taki0112/UGATIT
#tensorflow #unsupervisedlearning #generativemodels
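The repo's headline technique, adaptive layer-instance normalization (AdaLIN), interpolates between instance-norm and layer-norm statistics with a learned ratio. A rough NumPy sketch of that idea (parameter handling is simplified; in the paper rho is constrained to [0, 1] and gamma/beta come from fully connected layers on the attention features):

```python
import numpy as np

def adalin(x, gamma, beta, rho, eps=1e-5):
    """Adaptive layer-instance normalization on one (H, W, C) feature map."""
    # Instance norm: per-channel statistics over spatial positions.
    mu_in = x.mean(axis=(0, 1), keepdims=True)
    var_in = x.var(axis=(0, 1), keepdims=True)
    x_in = (x - mu_in) / np.sqrt(var_in + eps)
    # Layer norm: statistics over the whole (H, W, C) feature map.
    x_ln = (x - x.mean()) / np.sqrt(x.var() + eps)
    # Learned rho picks the mixture; gamma and beta rescale the result.
    return gamma * (rho * x_in + (1 - rho) * x_ln) + beta

out = adalin(np.random.rand(32, 32, 64), gamma=1.0, beta=0.0, rho=0.9)
```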
A Tensorflow 2.0 library for deep learning model interpretability
Blog by Raphaël Meudec : https://blog.sicara.com/tf-explain-interpretability-tensorflow-2-9438b5846e35
#MachineLearning #DeepLearning #TensorFlow #Interpretability
Understanding deep networks is crucial for model development and user adoption. tf-explain offers interpretability methods to gain insight on your network.
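A minimal usage sketch of the library's Grad-CAM explainer, following the project README; argument names and defaults may differ across tf-explain versions, and the model, image and layer name here are just placeholders:

```python
import numpy as np
import tensorflow as tf
from tf_explain.core.grad_cam import GradCAM

# Any Keras model works; here, an ImageNet classifier.
model = tf.keras.applications.MobileNetV2(weights="imagenet")
img = np.random.rand(1, 224, 224, 3).astype("float32")  # stand-in input

explainer = GradCAM()
# validation_data is an (inputs, labels) tuple; labels are unused here.
grid = explainer.explain(
    (img, None),
    model,
    class_index=281,      # ImageNet class to explain, e.g. "tabby cat"
    layer_name="Conv_1",  # last conv layer of MobileNetV2
)
explainer.save(grid, ".", "grad_cam.png")
```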
An artificial neural network called “EmoNet” can recognize which of 20 emotion categories a human would feel in response to an image, challenging the prevailing view that emotions are independent of the sensory environment.
Read the research in our open-access journal, Science Advances: https://fcld.ly/x5rbro1
Emotion schemas are embedded in the human visual system
Theorists have suggested that emotions are canonical responses to situations ancestrally linked to survival. If so, then emotions may be afforded by features of the sensory environment. However, few computational models describe how combinations of stimulus…
https://advances.sciencemag.org/content/advances/5/7/eaaw4358.full.pdf
In case you didn't know, there is a collection of datasets ready to use with TF:
https://github.com/tensorflow/datasets
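A minimal sketch of how it's used (dataset name and pipeline settings are just an example):

```python
import tensorflow_datasets as tfds

# Each dataset downloads and caches itself on first use.
ds = tfds.load("mnist", split="train", as_supervised=True)  # (image, label) pairs
ds = ds.shuffle(10_000).batch(32).prefetch(1)

for images, labels in ds.take(1):
    print(images.shape, labels.shape)  # (32, 28, 28, 1) (32,)
```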
ImageNet-trained deep neural network exhibits illusion-like response to the Scintillating Grid
https://arxiv.org/abs/1907.09019v1
Deep neural network (DNN) models for computer vision are now capable of human-level object recognition. Consequently, similarities in the performance and vulnerabilities of DNN and human vision...
Deep Non-Rigid Structure from Motion. arxiv.org/abs/1907.13123
Multi-Frame Cross-Entropy Training for Convolutional Neural Networks in Speech Recognition. arxiv.org/abs/1907.13121
Improved mutual information measure for classification and community detection. arxiv.org/abs/1907.12581
Quadtree Generating Networks: Efficient Hierarchical Scene Parsing with Sparse Convolutions. arxiv.org/abs/1907.11821
Many AI experts believe humanlike artificial general intelligence (AGI) is but a far-fetched dream, while others find inspiration in the quest for it. Speaking at last November’s AI Frontiers Conference, OpenAI Founder and Research Director Ilya Sutskever said: “We (OpenAI) have reviewed progress in the field over the past few years. Our conclusion is near-term AGI should be taken as a serious possibility.”
Today, respected scientific journal Nature boosted the case for AGI with a cover story on a new research paper, Towards artificial general intelligence with hybrid Tianjic chip architecture, which aims to stimulate AGI development by adopting generalized hardware platforms.
Typically, researchers have taken one of two paths in pursuit of AGI, proceeding either via computer science or via neuroscience. Each approach, however, requires its own unique and incompatible platform, and this has stalled overarching AGI research and development. With an eye on closing that gap, researchers from Tsinghua University, Beijing Lynxi Technology, Beijing Normal University, Singapore Polytechnic University and University of California Santa Barbara have introduced the Tianjic chip. The chip combines various core architectures and reconfigurable building blocks to accommodate both computer-science-based machine-learning algorithms and neuroscience-oriented schemes such as brain-inspired circuits.
A key innovation from the research team is Tianjic’s unified function core (FCore), which combines essential building blocks for both artificial neural networks and biologically inspired networks: axon, synapse, dendrite and soma blocks. The 28-nm chip consists of 156 FCores containing approximately 40,000 neurons and 10 million synapses in an area of 3.8×3.8 mm².
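To make the hybrid idea concrete, here is a purely illustrative Python sketch of a unit that can run in either a rate-coded (ANN) mode or a spiking (SNN) mode; this is a conceptual toy, not the actual FCore microarchitecture:

```python
import numpy as np

def fcore_step(x, w, v, mode, threshold=1.0, leak=0.9):
    """One update of a toy hybrid unit (illustrative only).

    mode="ann": weighted integration followed by a ReLU soma (rate-coded).
    mode="snn": leaky membrane integration; the soma emits a binary
    spike and resets wherever the potential crosses the threshold.
    """
    i = w @ x                                # synapse + dendrite: integration
    if mode == "ann":
        return np.maximum(i, 0.0), v         # rate-coded output, state unused
    v = leak * v + i                         # leaky integrate
    spike = (v >= threshold).astype(float)   # fire
    v = np.where(spike > 0, 0.0, v)          # reset fired units
    return spike, v

w = 0.1 * np.random.rand(8, 16)
x = np.random.rand(16)
rate, _ = fcore_step(x, w, np.zeros(8), mode="ann")
spikes, v = fcore_step(x, w, np.zeros(8), mode="snn")
```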
Tianjic delivers an internal memory bandwidth of more than 610 gigabytes (GB) per second and a peak performance of 1.28 tera operations per second (TOPS) per watt when running artificial neural networks. In the biologically inspired spiking-neural-network mode, Tianjic achieves a peak performance of about 650 giga synaptic operations per second (GSOPS) per watt. The research team also benchmarked Tianjic against a GPU, reporting 1.6–100 times better throughput and 12–10,000 times better power efficiency.
The research team designed a self-driving bicycle experiment to evaluate the chip’s capability for integrating multimodal information and making prompt decisions. Equipped with the Tianjic chip, an IMU sensor, a camera, a steering motor, a driving motor, a speed motor and a battery, the bicycle was tasked with real-time object detection, tracking, voice-command recognition, riding over a speed bump, obstacle avoidance, balance control and decision making.
The research team developed a variety of neural networks (CNN, CANN, SNN and MLP) to enable each task. The models were pretrained and programmed onto the Tianjic chip, which processes them in parallel and enables seamless on-chip communication across the different models.
In experiments, the Tianjic-powered bicycle smoothly performed all assigned tasks, signaling a huge leap forward for AGI development.
The research team also noted that “high spatiotemporal complexity can be generated by randomly introducing new variables into the environment in real time, such as different road conditions, noises, weather factors, multiple languages, more people and so on. By exploring solutions that allow adaptation to these environmental changes, issues critical to AGI — such as generalization, robustness and autonomous learning — can be examined.”
The research team told Chinese media they expect the Tianjic chip to be deployed in autonomous vehicles and smart robots. They have already started research on the next-generation chips and expect to close the R&D stage early next year.
Further information can be found in the paper Towards artificial general intelligence with hybrid Tianjic chip architecture.
https://syncedreview.com/2019/07/31/nature-cover-story-chinese-teams-tianjic-chip-bridges-machine-learning-and-neuroscience-in-pursuit-of-agi/
Nature Cover Story | Chinese Team’s ‘Tianjic Chip’ Bridges Machine Learning and Neuroscience in Pursuit of AGI
Modeling question asking using neural program generation
Ziyun Wang and Brenden M. Lake : https://arxiv.org/abs/1907.09899
#artificialintelligence #naturallanguageprocessing #reinforcementlearning
Landmark Detection in Low Resolution Faces with Semi-Supervised Learning. arxiv.org/abs/1907.13255
Wasserstein Robust Reinforcement Learning. arxiv.org/abs/1907.13196
AI-GAs: AI-generating algorithms, an alternate paradigm for producing general artificial intelligence
"We should keep in mind the grandeur of the task we are discussing, which is nothing short than the creation of an artificial intelligence smarter than humans. If we succeed, we arguably have also created life itself..."
By Jeff Clune : https://arxiv.org/abs/1905.10985
#ArtificialIntelligence #ArtificialGeneralIntelligence #MetaLearning
"We should keep in mind the grandeur of the task we are discussing, which is nothing short than the creation of an artificial intelligence smarter than humans. If we succeed, we arguably have also created life itself..."
By Jeff Clune : https://arxiv.org/abs/1905.10985
#ArtificialIntelligence #ArtificialGeneralIntelligence #MetaLearning
DeepMind: Using AI to give doctors a 48-hour head start on life-threatening illness
Blog: https://deepmind.com/blog/predicting-patient-deterioration/
Paper: https://www.nature.com/articles/s41586-019-1390-1
Artificial intelligence can now predict one of the leading causes of avoidable patient harm up to two days before it happens, as demonstrated by our latest research published in Nature.