An important collection of the 15 best machine learning cheat sheets.
1- Supervised Learning
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/cheatsheet-supervised-learning.pdf
2- Unsupervised Learning
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/cheatsheet-unsupervised-learning.pdf
3- Deep Learning
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/cheatsheet-deep-learning.pdf
4- Machine Learning Tips and Tricks
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/cheatsheet-machine-learning-tips-and-tricks.pdf
5- Probabilities and Statistics
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/refresher-probabilities-statistics.pdf
6- Comprehensive Stanford Master Cheat Sheet
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/super-cheatsheet-machine-learning.pdf
7- Linear Algebra and Calculus
https://github.com/afshinea/stanford-cs-229-machine-learning/blob/master/en/refresher-algebra-calculus.pdf
8- Data Science Cheat Sheet
https://s3.amazonaws.com/assets.datacamp.com/blog_assets/PythonForDataScience.pdf
9- Keras Cheat Sheet
https://s3.amazonaws.com/assets.datacamp.com/blog_assets/Keras_Cheat_Sheet_Python.pdf
10- Deep Learning with Keras Cheat Sheet
https://github.com/rstudio/cheatsheets/raw/master/keras.pdf
11- Visual Guide to Neural Network Architectures
https://www.asimovinstitute.org/wp-content/uploads/2016/09/neuralnetworks.png
12- Scikit-Learn Python Cheat Sheet
https://s3.amazonaws.com/assets.datacamp.com/blog_assets/Scikit_Learn_Cheat_Sheet_Python.pdf
13- Scikit-learn Cheat Sheet: Choosing the Right Estimator
https://scikit-learn.org/stable/tutorial/machine_learning_map/
14- TensorFlow Cheat Sheet
https://github.com/kailashahirwar/cheatsheets-ai/blob/master/PDFs/Tensorflow.pdf
15- Machine Learning Test Cheat Sheet
https://www.cheatography.com/lulu-0012/cheat-sheets/test-ml/pdf/
✳️ Help our community grow by adding friends or sharing this post.
X-Avatar: Expressive Human Avatars
🖥 Github: https://github.com/Skype-line/X-Avatar
⏩ Paper: https://arxiv.org/abs/2303.04805
💨 Dataset: https://github.com/Skype-line/X-Avatar/blob/main/xxx
⏩ Project: https://skype-line.github.io/projects/X-Avatar/
https://t.iss.one/DataScienceT
Datasets
Datasets collected for network science, deep learning and general machine learning research.
Github: https://github.com/benedekrozemberczki/datasets
Paper: https://arxiv.org/abs/2101.03091v1
Invite your friends 🌹🌹
@DataScience_Books
Multivariate Probabilistic Time Series Forecasting with Informer
An efficient transformer-based model for long-sequence time-series forecasting (LSTF).
The method introduces a probabilistic attention mechanism that selects the “active” queries rather than the “lazy” ones, yielding a sparse Transformer that mitigates the quadratic compute and memory requirements of vanilla attention. A minimal configuration sketch follows the links below.
🤗Hugging face:
https://huggingface.co/blog/informer
⏩ Paper:
https://huggingface.co/docs/transformers/main/en/model_doc/informer
⭐️ Colab:
https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/multivariate_informer.ipynb
💨 Dataset:
https://huggingface.co/docs/datasets/v2.7.0/en/package_reference/main_classes#datasets.Dataset.set_transform
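A minimal sketch (not the blog's code) of instantiating Informer from the Hugging Face transformers library; every configuration value below is an illustrative assumption, not taken from the linked notebook.

from transformers import InformerConfig, InformerForPrediction

config = InformerConfig(
    prediction_length=24,    # forecast horizon
    context_length=48,       # history window the encoder attends over
    input_size=7,            # number of variates (multivariate forecasting)
    num_time_features=2,     # e.g. encoded hour-of-day and day-of-week
    lags_sequence=[1, 24],   # lagged values appended as extra features
    attention_type="prob",   # the probabilistic (sparse) attention variant
)
model = InformerForPrediction(config)

# Training follows the usual pattern: pass past_values, past_time_features,
# past_observed_mask, future_values and future_time_features to the model and
# optimize the returned negative log-likelihood; see the Colab notebook above
# for the full data pipeline.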
https://t.iss.one/DataScienceT
Linear Algebra in Python: Matrix Inverses and Least Squares
https://realpython.com/python-linear-algebra/
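A short, self-contained illustration of the two topics the article covers; the numbers are made up for demonstration.

import numpy as np

# Square, invertible system A x = b, solved via the explicit inverse
# (np.linalg.solve is numerically preferred; the inverse is shown for clarity).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x_exact = np.linalg.inv(A) @ b

# Overdetermined system M x ≈ y: least squares minimizes ||M x - y||_2.
M = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 2.5])
x_ls, residuals, rank, singular_values = np.linalg.lstsq(M, y, rcond=None)

print("inverse solution:", x_exact)
print("least-squares fit:", x_ls)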
GPT-4 Technical Report
Source code: https://github.com/openai/evals
Paper: https://cdn.openai.com/papers/gpt-4.pdf
Tuned Lens 🔎
A simple interface for training and evaluating tuned lenses. A tuned lens lets us peek at the iterative computations a transformer uses to compute the next token.

pip install tuned-lens
🖥 Github: https://github.com/alignmentresearch/tuned-lens
⏩ Paper: https://arxiv.org/abs/2303.08112v1
⭐️ Dataset: https://paperswithcode.com/dataset/the-pile
🖥 Colab: https://colab.research.google.com/github/AlignmentResearch/tuned-lens/blob/main/notebooks/interactive.ipynb
https://t.iss.one/DataScienceT
OpenSeeD
A Simple Framework for Open-Vocabulary Segmentation and Detection
🖥 Github: https://github.com/idea-research/openseed
⏩ Paper: https://arxiv.org/abs/2303.08131v2
💨 Dataset: https://paperswithcode.com/dataset/objects365
https://t.iss.one/DataScienceT
Contrastive Semi-supervised Learning for Underwater Image Restoration via Reliable Bank
🖥 Github: https://github.com/huang-shirui/semi-uir
⏩ Paper: https://arxiv.org/abs/2303.09101v1
💨 Dataset: https://paperswithcode.com/dataset/uieb
https://t.iss.one/DataScienceT
WebSHAP: Towards Explaining Any Machine Learning Models Anywhere
🖥 Github: https://github.com/poloclub/webshap
⏩ Paper: https://arxiv.org/abs/2303.09545v1
💨 Project: https://poloclub.github.io/webshap
https://t.iss.one/DataScienceT
🖥 GigaGAN - Pytorch
Implementation of GigaGAN, a new state-of-the-art GAN from Adobe.
https://github.com/lucidrains/gigagan-pytorch
https://t.iss.one/DataScienceT
Taming Diffusion Models for Audio-Driven Co-Speech Gesture Generation (CVPR 2023)
A novel Diffusion Audio-Gesture Transformer is devised to better attend to information from multiple modalities and to model long-term temporal dependencies.
🖥 Github: https://github.com/advocate99/diffgesture
⏩ Paper: https://arxiv.org/abs/2303.09119v1
💨 Dataset: https://paperswithcode.com/dataset/beat
https://t.iss.one/DataScienceT
Deep Metric Learning for Unsupervised Change Detection (CD)
🖥 Github: https://github.com/wgcban/metric-cd
⏩ Paper: https://arxiv.org/abs/2303.09536v1
https://t.iss.one/DataScienceT
⚜️ ViperGPT: Visual Inference via Python Execution for Reasoning
ViperGPT, a framework that leverages code-generation models to compose vision-and-language models into subroutines to produce a result for any query.
🖥 Github: https://github.com/cvlab-columbia/viper
⏩ Paper: https://arxiv.org/pdf/2303.08128.pdf
💨 Project: https://paperswithcode.com/dataset/beat
https://t.iss.one/DataScienceT
🎥 Zero-1-to-3: Zero-shot One Image to 3D Object
Zero-1-to-3, a framework for changing the camera viewpoint of an object given just a single RGB image.
🖥 Github: https://github.com/cvlab-columbia/zero123
🤗 Hugging face: https://huggingface.co/spaces/cvlab/zero123-live
⏩ Paper: https://arxiv.org/abs/2303.11328v1
⏩ Dataset: https://zero123.cs.columbia.edu/
💨 Project: https://paperswithcode.com/dataset/beat
⭐️ Demo: https://huggingface.co/spaces/cvlab/zero123
https://t.iss.one/DataScienceT
MIT Introduction to Deep Learning - 2023, starting soon! MIT Intro to DL is one of the most concise AI courses on the web, covering basic deep learning techniques, architectures, and applications.
2023 lectures are starting in just one day, Jan 9th!
Link to register:
https://introtodeeplearning.com
The 2022 MIT Introduction to Deep Learning lectures can be found here:
https://m.youtube.com/playlist?list=PLtBw6njQRU-rwp5__7C0oIVt26ZgjG9NI
https://t.iss.one/DataScienceT
Train your ControlNet with diffusers 🧨
ControlNet is a neural network structure that allows fine-grained control of diffusion models by adding extra conditions; a minimal inference sketch follows the links below.
🤗 Hugging face: https://huggingface.co/blog/train-your-controlnet#
🖥 Github: https://github.com/huggingface/blog/blob/main/train-your-controlnet.md
⏩ ControlNet training example: https://github.com/huggingface/diffusers/tree/main/examples/controlnet
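A minimal inference sketch, assuming publicly available model IDs that are not mentioned in the post; it attaches a pretrained ControlNet to a Stable Diffusion pipeline in diffusers, where the conditioning image is the “extra condition” steering generation.

import numpy as np
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Canny-edge ControlNet (assumed checkpoint) plus base Stable Diffusion v1.5 weights.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

# Placeholder condition; in practice this would be a real Canny edge map of a photo.
condition = Image.fromarray(np.zeros((512, 512, 3), dtype=np.uint8))

# The conditioning image steers generation alongside the text prompt.
image = pipe("a red sports car on a mountain road", image=condition).images[0]
image.save("controlnet_sample.png")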
https://t.iss.one/DataScienceT