Diving into machine learning and Data Science.

We show how to run any LLMs, explained in simple terms.

For all questions: @haarrp

@itchannels_telegram - 🔥 best channels

RKN registry: clck.ru/3Fmqri
Introduction to Convolutional Neural Networks

The article explains the key components of CNNs and their implementation using the Keras Python library.

https://www.kdnuggets.com/2020/06/introduction-convolutional-neural-networks.html
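The central building block the article covers is the convolution itself. A minimal stdlib-only sketch of the operation (technically cross-correlation, as in most DL libraries, and what a Keras `Conv2D` layer computes per channel):

```python
# Toy sketch of the core CNN operation: a 2D "valid" convolution
# (cross-correlation, as deep learning libraries implement it).

def conv2d(image, kernel):
    """Slide `kernel` over `image` and return the feature map."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge detector on a tiny image: left half 0, right half 1.
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d(image, kernel)
print(feature_map)  # the edge between columns 1 and 2 lights up
```

The learned part in a real CNN is the kernel values; here they are fixed to show how a kernel responds to one spatial pattern.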
DetectoRS: Detecting Objects with Recursive Feature Pyramid and Switchable Atrous Convolution

Recursive Feature Pyramid implements "thinking twice" at the macro level: the outputs of the FPN are fed back into each stage of the bottom-up backbone through feedback connections.

Github: https://github.com/joe-siyuan-qiao/DetectoRS

Paper: https://arxiv.org/abs/2006.02334v1
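The feedback structure can be caricatured in a few lines: run the backbone once, compute FPN outputs, then run the backbone again with those outputs injected into each stage. The functions below are made-up scalar stand-ins, not the paper's architecture:

```python
# Toy caricature of a Recursive Feature Pyramid. All functions are
# hypothetical stand-ins for real conv stages and top-down fusion.

def backbone_stage(x, feedback=0.0):
    return 2 * x + feedback           # stand-in for one backbone stage

def fpn(features):
    return [f + 1 for f in features]  # stand-in for top-down FPN fusion

def recursive_feature_pyramid(x0, stages=3):
    # Pass 1: plain bottom-up backbone, then FPN.
    x, feats = x0, []
    for _ in range(stages):
        x = backbone_stage(x)
        feats.append(x)
    feedback = fpn(feats)
    # Pass 2: same input, but each stage also receives pass-1 FPN output.
    x, feats = x0, []
    for fb in feedback:
        x = backbone_stage(x, feedback=fb)
        feats.append(x)
    return fpn(feats)

print(recursive_feature_pyramid(1))
```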
A Scalable and Cloud-Native Hyperparameter Tuning System

Katib is a Kubernetes-based system for Hyperparameter Tuning and Neural Architecture Search. Katib supports a number of ML frameworks, including TensorFlow, Apache MXNet, PyTorch, XGBoost, and others.

Github: https://github.com/kubeflow/katib

Getting started with Katib: https://www.kubeflow.org/docs/components/hyperparameter-tuning/hyperparameter/

Paper: https://arxiv.org/abs/2006.02085v1
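Under the hood, a tuning system automates a search loop that a manual script would otherwise run. A stdlib sketch of the simplest strategy (random search); the objective function is a made-up stand-in, and Katib additionally distributes trials as Kubernetes jobs and supports smarter algorithms:

```python
import random

# Toy random-search loop over a hyperparameter space. The objective is a
# hypothetical stand-in for "train a model, return validation loss".

def objective(lr, batch_size):
    # Pretend validation loss, minimized near lr=0.01, batch_size=64.
    return (lr - 0.01) ** 2 + ((batch_size - 64) / 64) ** 2

def random_search(trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        params = {"lr": rng.uniform(1e-4, 0.1),
                  "batch_size": rng.choice([16, 32, 64, 128])}
        loss = objective(**params)
        if best is None or loss < best[0]:
            best = (loss, params)
    return best

loss, params = random_search()
print(params, loss)
```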
Introducing Neuropod, Uber ATG’s Open Source Deep Learning Inference Engine

Neuropod makes it easy for researchers to build models in a framework of their choosing while also simplifying productionization of these models.
It currently supports TensorFlow, PyTorch, TorchScript, and Keras.

https://eng.uber.com/introducing-neuropod/

Github: https://github.com/uber/neuropod

Neuropod Tutorial: https://neuropod.ai/tutorial/
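The core idea, independent of Neuropod's actual API (the sketch below is NOT its real interface), is hiding every framework behind one dict-in/dict-out inference call, so production code never depends on which framework trained the model:

```python
# Conceptual sketch of a framework-agnostic inference interface.
# These classes are illustrative stand-ins, not Neuropod's API.

class InferenceBackend:
    def infer(self, inputs):
        raise NotImplementedError

class DoublerBackend(InferenceBackend):
    # Stand-in for e.g. a TensorFlow-backed model.
    def infer(self, inputs):
        return {"y": [2 * v for v in inputs["x"]]}

class OffsetBackend(InferenceBackend):
    # Stand-in for e.g. a PyTorch-backed model.
    def infer(self, inputs):
        return {"y": [v + 1 for v in inputs["x"]]}

def serve(backend, inputs):
    # Production code only ever sees this uniform entry point.
    return backend.infer(inputs)

print(serve(DoublerBackend(), {"x": [1, 2, 3]}))  # {'y': [2, 4, 6]}
print(serve(OffsetBackend(), {"x": [1, 2, 3]}))   # {'y': [2, 3, 4]}
```

Swapping a model's framework then means swapping the backend, with no change to the serving code.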
Generalized Focal Loss: Learning Qualified and Distributed Bounding Boxes for Dense Object Detection

Github: https://github.com/implus/GFocal

Paper: https://arxiv.org/abs/2006.04388v1
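As I read the paper, the Quality Focal Loss generalizes focal loss to a continuous target: the classification score is supervised by the box's IoU quality rather than a hard 0/1 label, with a |y - σ|^β modulation on the binary cross-entropy. A stdlib sketch of that formula:

```python
import math

# Sketch of Quality Focal Loss (my reading of the GFL paper):
# QFL(sigma) = -|y - sigma|^beta * ((1 - y) log(1 - sigma) + y log(sigma)),
# where y in [0, 1] is a continuous quality (IoU) target.

def quality_focal_loss(sigma, y, beta=2.0):
    """sigma: predicted score in (0, 1); y: quality target in [0, 1]."""
    bce = (1 - y) * math.log(1 - sigma) + y * math.log(sigma)
    return -abs(y - sigma) ** beta * bce

# The modulation drives the loss toward 0 as the prediction nears the target...
print(quality_focal_loss(0.699, 0.7))  # ~0
# ...while confident wrong predictions are penalized heavily.
print(quality_focal_loss(0.95, 0.1))
```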
Deploy a Machine Learning Pipeline to the Cloud Using a Docker Container

https://www.kdnuggets.com/2020/06/deploy-machine-learning-pipeline-cloud-docker.html
VirTex: Learning Visual Representations from Textual Annotations

VirTex is a pretraining approach that uses semantically dense captions to learn visual representations. VirTex matches or outperforms models pretrained on ImageNet, both supervised and unsupervised, despite using up to 10x fewer images.

https://kdexd.github.io/virtex/

Github: https://github.com/kdexd/virtex

Paper: arxiv.org/abs/2006.06666
Rethinking the Truly Unsupervised Image-to-Image Translation - Official PyTorch Implementation

This repository provides TUNIT, a truly unsupervised image-to-image translation method that simultaneously learns to separate image domains via an information-theoretic approach and to generate the corresponding images using the estimated domain labels.

Github: https://github.com/clovaai/tunit

Paper: https://arxiv.org/abs/2006.06500v1
From singing to musical scores: Estimating pitch with SPICE and Tensorflow Hub

Pitch is quantified by frequency, measured in Hertz (Hz), where one Hz corresponds to one cycle per second. The higher the frequency, the higher the note.

https://blog.tensorflow.org/2020/06/estimating-pitch-with-spice-and-tensorflow-hub.html

Model: https://tfhub.dev/google/spice/2

Colab code: https://colab.research.google.com/github/tensorflow/hub/blob/master/examples/colab/spice.ipynb
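Since the frequency-to-note relationship is logarithmic (each semitone multiplies frequency by 2^(1/12)), converting a detected pitch in Hz, such as SPICE's output, to the nearest note takes a few lines; the standard A4 = 440 Hz tuning is assumed here:

```python
import math

# Map a pitch in Hz to the nearest note name (scientific pitch notation),
# assuming A4 = 440 Hz. The semitone distance from A4 is 12 * log2(f / 440).

NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def hz_to_note(freq):
    semitones = round(12 * math.log2(freq / 440.0))  # offset from A4
    name = NOTE_NAMES[semitones % 12]
    # A is the 10th note of its octave, hence the +9 before flooring.
    octave = 4 + (semitones + 9) // 12
    return f"{name}{octave}"

print(hz_to_note(440.0))   # A4
print(hz_to_note(261.63))  # C4 (middle C)
print(hz_to_note(880.0))   # A5 (one octave = double the frequency)
```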
SimCLR - A Simple Framework for Contrastive Learning of Visual Representations

The findings described in this paper can potentially be harnessed to improve accuracy in any application of computer vision where it is more expensive or difficult to label additional data than to train larger models.

Github: https://github.com/google-research/simclr

Paper: https://arxiv.org/abs/2006.10029
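At the heart of SimCLR is a contrastive (NT-Xent) loss: pull an anchor embedding toward its positive (an augmented view of the same image) and away from everything else in the batch. A stdlib sketch for a single anchor:

```python
import math

# Stdlib sketch of the NT-Xent contrastive loss for one anchor:
# softmax cross-entropy over cosine similarities, with the positive
# pair as the correct "class".

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nt_xent(anchor, positive, negatives, temperature=0.5):
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    denom = sum(math.exp(l) for l in logits)
    return -math.log(math.exp(logits[0]) / denom)

# A well-aligned positive gives a low loss; a mismatched one, a high loss.
easy = nt_xent([1.0, 0.0], [0.9, 0.1], [[-1.0, 0.0]])
hard = nt_xent([1.0, 0.0], [-1.0, 0.0], [[0.9, 0.1]])
print(easy, hard)
```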
Data-Efficient GANs with DiffAugment

Differentiable Augmentation (DiffAugment) is a simple method that improves the data efficiency of GANs by imposing various types of differentiable augmentations on both real and fake samples.

Github: https://github.com/mit-han-lab/data-efficient-gans

Paper: https://arxiv.org/abs/2006.10738

Training code: https://github.com/mit-han-lab/data-efficient-gans/tree/master/DiffAugment-stylegan2
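The two ingredients are visible in a toy sketch: the augmentation must be differentiable (so generator gradients flow through it), and it must hit both the real and the fake stream before the discriminator. A brightness shift stands in for the paper's augmentations here:

```python
import random

# Toy sketch of the DiffAugment idea. A brightness shift x + b is
# differentiable (gradient 1 w.r.t. x), so generator gradients pass
# through it; applying it to BOTH streams means the discriminator
# cannot use the augmentation itself as a real-vs-fake cue.
# Images are flat lists of pixel values in [0, 1].

def diff_augment(image, rng):
    b = rng.uniform(-0.2, 0.2)  # random brightness shift
    return [pixel + b for pixel in image]

def discriminator_inputs(real, fake, rng):
    return diff_augment(real, rng), diff_augment(fake, rng)

rng = random.Random(0)
real_aug, fake_aug = discriminator_inputs([0.5, 0.6], [0.4, 0.7], rng)
print(real_aug, fake_aug)
```

Note that a uniform shift preserves pixel differences, i.e. the image content, while varying what the discriminator sees per step.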
Machine Learning in Dask

This article shows how Dask handles huge datasets, either on a local machine or distributed across a cluster.

https://www.kdnuggets.com/2020/06/machine-learning-dask.html
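The pattern Dask builds on is split-apply-combine over partitions: compute a partial result per chunk that fits in memory, then aggregate. A stdlib sketch of the idea (Dask adds lazy task graphs and parallel/distributed scheduling on top):

```python
# Stdlib sketch of partitioned computation: process a dataset chunk by
# chunk and combine partial aggregates, so no chunk larger than
# `chunk_size` is ever held at once.

def chunked(data, chunk_size):
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

def chunked_mean(data, chunk_size=3):
    total, count = 0.0, 0
    for chunk in chunked(data, chunk_size):  # each chunk fits in memory
        total += sum(chunk)                  # partial aggregate per chunk
        count += len(chunk)
    return total / count

print(chunked_mean([1, 2, 3, 4, 5, 6, 7]))  # 4.0
```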