On Artificial Intelligence
108 subscribers
27 photos
36 files
466 links
If you want to know more about science, especially artificial intelligence, this is the right place for you.
Admin Contact:
@Oriea
Steps Toward Artificial Intelligence.pdf
5.8 MB
Discusses several issues relevant to trial-and-error learning (reinforcement learning), including prediction,
expectation, and what the paper calls the basic credit-assignment problem for complex reinforcement-learning
systems: how do you distribute credit for success among the many
decisions that may have been involved in producing it? (Published 1961)
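As a toy illustration (mine, not from the paper), the credit-assignment problem shows up clearly in temporal-difference learning, where repeated value updates propagate credit for a delayed reward back to the earlier states that led to it. The corridor environment below is a hypothetical minimal example:

```python
# TD(0) learning on a 4-state corridor: states 0 -> 1 -> 2 -> 3,
# with a reward of 1.0 only on reaching terminal state 3.
# Over many episodes, credit for the final reward is distributed
# backward across the chain of earlier decisions.

def td0_corridor(episodes=200, alpha=0.5, gamma=0.9):
    n_states = 4
    v = [0.0] * n_states              # value estimates; state 3 is terminal
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            s_next = s + 1
            reward = 1.0 if s_next == n_states - 1 else 0.0
            # TD(0) update: move v[s] toward the bootstrapped target
            v[s] += alpha * (reward + gamma * v[s_next] - v[s])
            s = s_next
    return v

values = td0_corridor()
# States closer to the reward end up with higher estimated value,
# i.e. credit has been assigned across the whole trajectory.
print(values)
```

With a discount of 0.9, the values converge toward 1.0, 0.9, and 0.81 for states 2, 1, and 0 respectively, showing how credit decays as it is passed back.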
StatisticalLearningTheory.pdf
359.2 KB
A brief introduction to statistical learning theory
Computational Learning Theory versus Statistical Learning Theory
While both frameworks use similar mathematical analysis, the primary difference between CoLT and SLT is their objectives. CoLT focuses on studying “learnability,” or what functions/features are necessary to make a given task learnable for an algorithm. SLT, by contrast, is primarily focused on studying and improving the accuracy of existing training programs.
https://deepai.org/machine-learning-glossary-and-terms/computational-learning-theory
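As a worked illustration of the statistical side (my example, not from the glossary entry), the classic PAC bound for a finite hypothesis class turns an accuracy target eps and confidence target delta into a required sample size:

```python
import math

# Sample-complexity bound for a finite hypothesis class H (PAC framework):
# with probability at least 1 - delta, every h in H has true error within
# eps of its empirical error once
#     m >= ln(2|H|/delta) / (2 * eps^2)
# samples are drawn. This follows from Hoeffding's inequality plus a
# union bound over the |H| hypotheses.

def pac_sample_bound(h_size, eps, delta):
    return math.ceil(math.log(2 * h_size / delta) / (2 * eps ** 2))

# Illustrative numbers (hypothetical): 1000 hypotheses, 5% error
# tolerance, 95% confidence.
m = pac_sample_bound(h_size=1000, eps=0.05, delta=0.05)
print(m)  # -> 2120
```

Note how the bound grows only logarithmically in |H| but quadratically in 1/eps, which is why tightening the error tolerance is far more expensive than enlarging the hypothesis class.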
sgd.pdf
3.9 MB
A brief introduction to stochastic approximation theory
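As a minimal sketch of the stochastic-approximation idea (illustrative, not from the PDF): a Robbins-Monro iteration with step sizes a_t = 1/t, which satisfy the classic conditions (the step sizes sum to infinity while their squares sum to a finite value), converges on noisy gradient estimates. Here SGD estimates the minimizer of f(x) = E[(x - Z)^2]/2, i.e. the mean of Z:

```python
import random

# Robbins-Monro / SGD sketch: estimate the mean of a distribution from
# noisy samples by following stochastic gradients of (x - z)^2 / 2.
# Step sizes a_t = 1/t satisfy the Robbins-Monro conditions:
# sum(a_t) diverges while sum(a_t^2) converges.

def sgd_mean(samples):
    x = 0.0
    for t, z in enumerate(samples, start=1):
        grad = x - z              # stochastic gradient of (x - z)^2 / 2
        x -= (1.0 / t) * grad     # decaying step size 1/t
    return x

random.seed(0)
samples = [random.gauss(3.0, 1.0) for _ in range(10000)]
print(sgd_mean(samples))  # approaches the true mean, 3.0
```

With this particular loss and step-size schedule the iterate is exactly the running sample mean, which makes the convergence guarantee easy to see concretely.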
How to write a book.pdf
548.2 KB
A 20-step guide to writing a book
Forwarded from Deleted Account
🔻The fifth meetup of the School of AI in Rasht
🔺School of AI

📘Topic of this session: computational neuroscience and its relation to artificial intelligence
🗓Monday, 11 Shahrivar 1398 (September 2, 2019)
🕐16:30 to 19:00
🏢 Rasht, Entezam Square (Razi Bridge), Guilan Science and Technology Park

*To register, visit the link below:
▪️https://evnd.co/WtokW

For more information about this school, see the channel below:
🏢 @schoolofairasht

@Brainandcognition_GU
What is “ML Ops”? Best Practices for DevOps for ML
https://www.itsalways10.com/what-is-ml-ops-best-practices-for-devops-for-ml/
A Shared Vision for Machine Learning in Neuroscience
https://www.jneurosci.org/content/jneuro/38/7/1601.full.pdf
Forwarded from Tensorflow(@CVision) (Vahid Reza Khazaie)
Fast-Bert

This library will help you build and deploy BERT-based models within minutes:

Fast-Bert is a deep learning library that allows developers and data scientists to train and deploy BERT- and XLNet-based models for natural language processing tasks, beginning with text classification.

FastBert is built on the solid foundations of the excellent Hugging Face BERT PyTorch library, is inspired by fast.ai, and strives to make cutting-edge deep learning accessible to the vast community of machine learning practitioners.

With FastBert, you will be able to:

Train (more precisely fine-tune) BERT, RoBERTa and XLNet text classification models on your custom dataset.

Tune model hyper-parameters such as epochs, learning rate, batch size, optimiser schedule and more.

Save and deploy the trained model for inference (including on AWS SageMaker).

Fast-Bert supports both multi-class and multi-label text classification; in due course, it will support other NLU tasks such as Named Entity Recognition, Question Answering, and custom-corpus fine-tuning.

Blog post: https://medium.com/huggingface/introducing-fastbert-a-simple-deep-learning-library-for-bert-models-89ff763ad384

Code: https://github.com/kaushaltrivedi/fast-bert

#language_model #BERT
The Roles of Supervised Machine Learning in Systems Neuroscience
Over the last several years, the use of machine learning (ML) in neuroscience has been rapidly increasing. Here, we review ML's contributions, both realized and potential, across several areas of systems neuroscience. We describe four primary roles of ML within neuroscience: 1) creating solutions to engineering problems, 2) identifying predictive variables, 3) setting benchmarks for simple models of the brain, and 4) serving itself as a model for the brain. The breadth and ease of its applicability suggest that machine learning should be in the toolbox of most systems neuroscientists.
https://arxiv.org/ftp/arxiv/papers/1805/1805.08239.pdf
#neuroscience #machine_learning