On Artificial Intelligence
If you want to know more about science, especially Artificial Intelligence, this is the right place for you.
Admin Contact:
@Oriea
Forwarded from School of AI
zisserman-self-supervised.pdf
9.1 MB
A tutorial on self-supervised learning by Andrew Zisserman of Google DeepMind.

Talks at ICML 2019:
https://www.facebook.com/icml.imls/videos/2030095370631729/
Empirically, XLNet outperforms BERT on 20 tasks, often by a large margin, and achieves state-of-the-art results on 18 tasks including question answering, natural language inference, sentiment analysis, and document ranking.

https://arxiv.org/abs/1906.08237#
#NLP
A great tutorial showing how to implement the algorithms above in Python:
https://github.com/MorvanZhou/Evolutionary-Algorithm
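The repo above collects genetic algorithms, evolution strategies, and related examples. As a taste of the genre, here is a minimal genetic algorithm over bit strings (all names and parameters are my own illustration, not code from the repo):

```python
import random

def evolve(fitness, n_genes=10, pop_size=50, generations=100,
           mutation_rate=0.05):
    """Minimal genetic algorithm maximizing `fitness` over bit strings."""
    # Random initial population of fixed-length bit strings.
    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: keep the fitter half as parents.
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ 1 if random.random() < mutation_rate else g
                     for g in child]               # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Example: maximize the number of ones ("OneMax").
best = evolve(sum)
```

The same selection/crossover/mutation loop generalizes to real-valued genomes, which is how the repo's evolution-strategy examples work.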
Forwarded from Tensorflow(@CVision) (Vahid Reza Khazaie)
New Google Brain Optimizer Reduces BERT Pre-Training Time From Days to Minutes

Reducing BERT language-model pre-training time from three days to 76 minutes with a new optimizer!

Google Brain researchers have proposed LAMB (Layer-wise Adaptive Moments optimizer for Batch training), a new optimizer that reduces pre-training time for Google's NLP model BERT (Bidirectional Encoder Representations from Transformers) from three days to just 76 minutes.

Paper: https://arxiv.org/abs/1904.00962
Blog post: https://medium.com/syncedreview/new-google-brain-optimizer-reduces-bert-pre-training-time-from-days-to-minutes-b454e54eda1d

#BERT #language_model #optimizer
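The core idea of LAMB is to take an Adam-style step and rescale it per layer by a "trust ratio" (the norm of the weights over the norm of the proposed update), which keeps very large batches stable. A minimal NumPy sketch of one update, based on my reading of the paper rather than the authors' code:

```python
import numpy as np

def lamb_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01):
    """One LAMB update for a single parameter tensor (one layer)."""
    # Adam-style first and second moment estimates, bias-corrected.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Adam direction plus decoupled weight decay.
    update = m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w
    # Layer-wise trust ratio: scale the step by ||w|| / ||update||.
    w_norm = np.linalg.norm(w)
    u_norm = np.linalg.norm(update)
    trust_ratio = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    return w - lr * trust_ratio * update, m, v
```

In the paper this per-layer normalization is what lets the batch size scale to tens of thousands without hand-tuned warmup, cutting wall-clock pre-training time as described above.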