ArtificialIntelligenceArticles
For those who have a passion for:
1. #ArtificialIntelligence
2. Machine Learning
3. Deep Learning
4. #DataScience
5. #Neuroscience
6. #ResearchPapers
7. Related Courses and Ebooks
What goes around comes around: Cycle-Consistency-based Short-Term Motion Prediction for A...
arxiv.org/abs/1908.03055
CGI faces will soon be indistinguishable from real ones. Here’s how
https://www.digitaltrends.com/cool-tech/cubic-motion-scanning-technology/
Video lectures from the Machine Learning Summer School (MLSS'19), hosted by UCL's Centre for Computational Statistics and Machine Learning, are available here:

https://search.videoken.com/?orgId=198

The topics range from optimization and Bayesian inference to deep learning, reinforcement learning, and Gaussian processes. The lectures are tutorial-style: each starts from the basics, then quickly picks up the pace so that after 2-4 hours of teaching it arrives at the state of the art in its subject area.

#summerschool #machinelearning #tutorials #deeplearning #speechprocessing #reinforcementlearning
A must-follow GitHub repository
⚡️100+ AI Cheatsheets
⚡️List of Free AI Courses
⚡️Free Online Books
⚡️Top 10 Online Books
⚡️Research Papers with Code
⚡️Top Videos & Lectures on AI + ML
⚡️99+ AI Researchers
⚡️Top Websites to Follow
⚡️121+ Free Datasets
⚡️53+ AI Frameworks, and many more
All in one GitHub repository:
https://github.com/Niraj-Lunavat/Artificial-Intelligence
#Github #ArtificialIntelligence #ai #ml #machinelearning
Looking to fall in love... with science? 😍 Help scientists train machines to study stroke lesions by swiping on our app:

https://braindrles.us/#/

#citizenscience #braindr #braindrles #neuroscience #machinelearning #swipesforscience #openscience #OHBM2019
MegatronLM: Training Billion+ Parameter Language Models Using GPU Model Parallelism
"... training an 8.3 billion parameter transformer language model with 8-way model parallelism and 64-way data parallelism on 512 GPUs, making it the largest transformer based language model ever trained at 24x the size of BERT and 5.6x the size of GPT-2."
Blog by NVIDIA Applied Deep Learning Research: https://nv-adlr.github.io/MegatronLM
Code: https://github.com/nvidia/megatron-lm
#ArtificialIntelligence #DeepLearning #NLP #PyTorch #Transformer
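For intuition on what "8-way model parallelism" means in practice, here is a minimal sketch of Megatron-style tensor parallelism for a single transformer MLP block, assuming PyTorch with torch.distributed already initialized (one process per GPU). The class name ParallelMLP and its parameters are illustrative, not taken from the megatron-lm code:

```python
# Sketch: split a transformer MLP block across model-parallel GPUs.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.distributed as dist

class ParallelMLP(nn.Module):
    """Each rank holds one shard of the MLP weights.

    The first weight matrix is split column-wise, so each rank applies
    GeLU to its shard independently; the second is split row-wise, so
    the partial outputs combine with a single all-reduce.
    """
    def __init__(self, hidden: int, ffn: int, world_size: int):
        super().__init__()
        assert ffn % world_size == 0
        shard = ffn // world_size
        self.a = nn.Linear(hidden, shard)              # column shard of A (bias shards too)
        self.b = nn.Linear(shard, hidden, bias=False)  # row shard of B (bias added once, post-reduce)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = F.gelu(self.a(x))  # purely local, no communication
        z = self.b(y)          # this rank's partial sum of the full output
        dist.all_reduce(z)     # sum the shards; a real implementation wraps
                               # this in an autograd-aware op
        return z
```

Data parallelism then replicates this sharded model across groups of GPUs: 8-way model parallelism times 64-way data parallelism accounts for the 512 GPUs in the post.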