ArtificialIntelligenceArticles
For those who have a passion for:
1. #ArtificialIntelligence
2. Machine Learning
3. Deep Learning
4. #DataScience
5. #Neuroscience
6. #ResearchPapers
7. Related courses and ebooks
Head animation from a single shot by the #SamsungAI team

Samsung researchers have released a model that can synthesize a face in new poses from just a single image/frame for each of the face and the target pose. It works by pre-training a landmark model and then adapting from it in one shot, using keypoints, adaptive instance normalization and GANs. The model does no 3D face modelling at all!
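For intuition about the adaptive instance normalization mentioned above, here is a minimal AdaIN-style layer in PyTorch. The class name, dimensions and the linear layer predicting scale/shift are illustrative assumptions, not the authors' architecture:

```python
import torch
import torch.nn as nn

class AdaIN(nn.Module):
    """Adaptive instance normalization: normalize content features per channel,
    then rescale/shift them with statistics predicted from a style/identity embedding.
    (Illustrative sketch, not the Samsung model.)"""
    def __init__(self, num_channels, style_dim):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_channels, affine=False)
        self.to_scale_shift = nn.Linear(style_dim, num_channels * 2)

    def forward(self, x, style):
        # x: (B, C, H, W) feature map; style: (B, style_dim) embedding
        scale, shift = self.to_scale_shift(style).chunk(2, dim=1)
        scale = scale.unsqueeze(-1).unsqueeze(-1)
        shift = shift.unsqueeze(-1).unsqueeze(-1)
        return (1 + scale) * self.norm(x) + shift
```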

ArXiv: https://arxiv.org/abs/1905.08233v1
Youtube: https://www.youtube.com/watch?v=p1b5aiTrGzY

#GAN #CV #DL
From Planck Area to Graph Theory: Topologically Distinct Black Hole Microstates. arxiv.org/abs/1907.03090
Machine Learning for Everyone.

The best general intro post on Machine Learning, covering everything you need to know to avoid getting overexcited about SkyNet and to gain a general understanding of all the #ML / #AI hype. Save this post to «Saved messages» and forward it to your friends to get them familiar with the subject.

Link: https://vas3k.com/blog/machine_learning/

#entrylevel #novice #general
New deep learning framework from Facebook

Pythia is a deep learning framework that supports multitasking in the vision and language domain. Built on Facebook's open-source #PyTorch framework, its modular, plug-and-play design lets researchers quickly build, reproduce, and benchmark AI models. #Pythia is designed for vision-and-language tasks, such as answering questions about visual data and automatically generating image captions.

Link: https://code.fb.com/ai-research/pythia/
GitHub: https://github.com/facebookresearch/pythia

#Facebook #FacebookAI #DL #CV #multimodal
A Recipe for Training Neural Networks by Andrej Karpathy

New article written by Andrej Karpathy distilling a bunch of useful heuristics for training neural nets. The post is full of real-world knowledge and how-to details that are not taught in books and often take endless hours to learn the hard way.
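One of the sanity checks the recipe advocates is making sure your training loop can overfit a single small batch before training for real. A minimal sketch of that check in PyTorch, with a placeholder model and random data that are purely illustrative:

```python
import torch
import torch.nn as nn

# Sanity check in the spirit of the recipe: overfit one tiny batch first.
# If the loss doesn't drive toward ~0, something in the pipeline is broken.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))  # one fixed batch
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(f"final loss on the single batch: {loss.item():.4f}")  # expect near zero
```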

Link: https://karpathy.github.io/2019/04/25/recipe/

#tipsandtricks #karpathy #tutorial #nn #ml #dl
Awesome new paper from FAIR:
1. A new type of large-scale memory layer that uses product keys (FAISS-like indexing with product quantization)
2. Replace some layers in a BERT-like architecture with these Product Key Memory layers.
.....
3. PROFIT: better perplexity than BERT for half the computation.

NLP tasks require lots of memory. This is a good way to give a neural net loads of memory while keeping it computationally practical, because the memory is sparsely activated. PKM layers can be seen as running a sort of "winner-takes-all" competition that sparsifies the activations.


https://arxiv.org/abs/1907.05242
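For intuition, here is a minimal sketch of a product-key lookup in PyTorch: the query is split into two halves, each half is scored against a small sub-key table, and the top-k candidates from their Cartesian product select a sparse set of value slots. The single head, absence of a query network, and all sizes are simplifying assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

class ProductKeyMemory(nn.Module):
    """Sketch of a product-key memory lookup (single head, no query network)."""
    def __init__(self, dim, n_sub_keys=128, topk=8):
        super().__init__()
        half = dim // 2
        # Two small sub-key tables; their Cartesian product spans n_sub_keys**2 slots.
        self.sub_keys1 = nn.Parameter(torch.randn(n_sub_keys, half))
        self.sub_keys2 = nn.Parameter(torch.randn(n_sub_keys, half))
        self.values = nn.EmbeddingBag(n_sub_keys ** 2, dim, mode="sum")
        self.n_sub_keys = n_sub_keys
        self.topk = topk

    def forward(self, query):
        # query: (B, dim), split into two halves scored against the two sub-key tables
        q1, q2 = query.chunk(2, dim=-1)
        s1, i1 = (q1 @ self.sub_keys1.t()).topk(self.topk, dim=-1)  # (B, k)
        s2, i2 = (q2 @ self.sub_keys2.t()).topk(self.topk, dim=-1)  # (B, k)
        # Combine the two top-k lists into k*k candidate slots and scores
        scores = (s1.unsqueeze(-1) + s2.unsqueeze(-2)).flatten(1)
        indices = (i1.unsqueeze(-1) * self.n_sub_keys + i2.unsqueeze(-2)).flatten(1)
        best_scores, best_pos = scores.topk(self.topk, dim=-1)
        best_indices = indices.gather(1, best_pos)
        weights = torch.softmax(best_scores, dim=-1)
        # Sparse weighted sum over only the selected memory values
        return self.values(best_indices, per_sample_weights=weights)
```

Only k of the n_sub_keys**2 value slots are read per query, which is where the "lots of memory, little computation" trade-off comes from.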
Check out the book:

Generative Adversarial Networks with Python https://machinelearningmastery.com/generative_adversarial_networks/