ArtificialIntelligenceArticles
For those who have a passion for:
1. #ArtificialIntelligence
2. Machine Learning
3. Deep Learning
4. #DataScience
5. #Neuroscience
6. #ResearchPapers
7. Related courses and ebooks
From Planck Area to Graph Theory: Topologically Distinct Black Hole Microstates. arxiv.org/abs/1907.03090
Machine Learning for Everyone.

The best general introductory post about machine learning, covering everything you need to know to avoid getting overexcited about SkyNet and to gain a general understanding of all the #ML / #AI hype. Save this post to «Saved messages» and forward it to your friends to introduce them to the subject.

Link: https://vas3k.com/blog/machine_learning/

#entrylevel #novice #general
New deep learning framework from Facebook

Pythia is a deep learning framework that supports multitasking in the vision and language domain. Built on our open-source #PyTorch framework, the modular, plug-and-play design enables researchers to quickly build, reproduce, and benchmark AI models. #Pythia is designed for vision and language tasks, such as answering questions related to visual data and automatically generating image captions.

Link: https://code.fb.com/ai-research/pythia/
GitHub: https://github.com/facebookresearch/pythia

#Facebook #FacebookAI #DL #CV #multimodal
A Recipe for Training Neural Networks by Andrej Karpathy

New article written by Andrej Karpathy distilling a bunch of useful heuristics for training neural nets. The post is full of real-world knowledge and how-to details that are not taught in books and often take endless hours to learn the hard way.

Link: https://karpathy.github.io/2019/04/25/recipe/

#tipsandtricks #karpathy #tutorial #nn #ml #dl
Awesome new paper from FAIR:
1. A new type of large-scale memory layer that uses product keys (FAISS-like indexing with product quantization).
2. Some layers in a BERT-like architecture are replaced by these Product Key Memory layers.
3. PROFIT: better perplexity than BERT at half the computation.

NLP tasks require lots of memory. This is a good way to give a neural net loads of memory while keeping it computationally practical by making it sparsely activated. PKM layers can be seen as running a sort of "winner-take-all" competition to sparsify the activations.


https://arxiv.org/abs/1907.05242
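The product-key trick above can be illustrated with a minimal NumPy sketch (all names, shapes, and the `pkm_lookup` helper are my own for illustration, not the paper's code): the query is split in half, each half is scored against a small codebook of sub-keys, and the Cartesian product of the two top-k lists addresses a K×K value table while only 2·K sub-key scores are ever computed.

```python
import numpy as np

def pkm_lookup(query, subkeys1, subkeys2, values, topk=4):
    """Hypothetical sketch of a product-key memory lookup."""
    d = query.shape[0]
    q1, q2 = query[: d // 2], query[d // 2 :]

    s1 = subkeys1 @ q1            # (K,) scores for the first query half
    s2 = subkeys2 @ q2            # (K,) scores for the second query half

    i1 = np.argsort(s1)[-topk:]   # top-k sub-key indices per half
    i2 = np.argsort(s2)[-topk:]

    # Candidate full keys are pairs (a, b); their score is s1[a] + s2[b],
    # and pair (a, b) addresses slot a*K + b of the large value table.
    K2 = subkeys2.shape[0]
    cand = [(s1[a] + s2[b], a * K2 + b) for a in i1 for b in i2]
    cand.sort(reverse=True)
    top = cand[:topk]

    # Sparse softmax-weighted sum over only the top-k memory slots.
    scores = np.array([s for s, _ in top])
    idx = [i for _, i in top]
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ values[idx]

rng = np.random.default_rng(0)
K, d, dv = 16, 8, 4               # 16*16 = 256 slots, reached via only 32 sub-keys
sub1 = rng.standard_normal((K, d // 2))
sub2 = rng.standard_normal((K, d // 2))
vals = rng.standard_normal((K * K, dv))
out = pkm_lookup(rng.standard_normal(d), sub1, sub2, vals)
print(out.shape)  # (4,)
```

Only `topk` of the 256 value rows are ever touched per query, which is the sparse, "winner-take-all" activation the post describes.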
Check out this book:

Generative Adversarial Networks with Python https://machinelearningmastery.com/generative_adversarial_networks/
Happy Birthday to the scientific discipline known as Artificial Intelligence.

"The Dartmouth Summer Research Project on Artificial Intelligence was the name of a 1956 summer workshop now considered by many to be the seminal event for artificial intelligence as a field."

"On July 13-15, 2005, the Dartmouth Artificial Intelligence Conference: The Next Fifty Years was held at the College, attended by 175 participants from all around the world, including 43 PhDs and Post Docs invited as guests to share their aspirations with the experts on the future of AI.

Thirty-one AI experts lectured the morning, afternoon, and evening sessions of AI@50, after an historic gathering in Baker Library on Wednesday afternoon, July 12, to honor the five surviving founders of AI and unveil the plaque commemorating the original Dartmouth Summer Research Project on Artificial Intelligence that created AI as a research discipline in 1956.

Present for that occasion were John McCarthy, back then a Dartmouth mathematics professor who first coined the term "artificial intelligence" to apply for a grant to fund the 1956 conference, along with four other founding colleagues - Marvin Minsky, Oliver Selfridge, Ray Solomonoff, and Trenchard More.

All five spoke at a Friday evening panel on the significant AI achievements over the first half century, evoking many past "greats" associated with pioneering AI developments - most especially
Allen Newell and Herbert Simon, who discussed their Logic Theorist program in 1956 as an early indication of the future for artificial intelligence through digital computer development."

https://www.dartmouth.edu/~ai50/homepage.html
https://en.m.wikipedia.org/wiki/Dartmouth_workshop

https://t.iss.one/ArtificialIntelligenceArticles