Artificial Intelligence

admin - @haarrp

@itchannels_telegram - πŸ”₯ best it channels

@ai_machinelearning_big_data - Machine learning channel

@pythonl - Our Python channel

@pythonlbooks - python books 📚

@datascienceiot - ml πŸ“š

RKN: clck.ru/3FmwZw
Full camouflage fixation training dataset is available!

Full fixation maps for the COD10K training dataset can be downloaded from: https://drive.google.com/file/d/1inb5iNTDswFPDm4SpzBbVgZdI4puAv_3/view?usp=sharing

Github: https://github.com/JingZhang617/COD-Rank-Localize-and-Segment

Paper: https://arxiv.org/abs/2205.11333v1

Dataset: https://paperswithcode.com/dataset/salicon

@ArtificialIntelligencedl
πŸ‘3
πŸ‘4
On the Eigenvalues of Global Covariance Pooling for Fine-grained Visual Recognition

Github: https://github.com/KingJamesSong/DifferentiableSVD

Paper: https://arxiv.org/abs/2205.13282v1

Dataset: https://paperswithcode.com/dataset/imagenet

@ArtificialIntelligencedl
πŸ‘6
SemAffiNet: Semantic-Affine Transformation for Point Cloud Segmentation

Github: https://github.com/wangzy22/SemAffiNet

Paper: https://arxiv.org/abs/2205.13490v1

Dataset: https://paperswithcode.com/dataset/cityscapes

@ArtificialIntelligencedl
❀3πŸ‘1
CMA-ES with Margin

CMA-ES with Margin (CMA-ESwM) is a CMA-ES variant for mixed-integer black-box optimization that introduces a lower bound on the marginal probability associated with each integer variable, so that integer coordinates are never fixed prematurely.

Github: https://github.com/evoconjp/cma-es_with_margin

Paper: https://arxiv.org/abs/2205.13482v1

@ArtificialIntelligencedl
πŸ‘5
πŸ‘5
πŸ“ Awesome Artificial Intelligence

List

@ArtificialIntelligencedl
πŸ‘4πŸ”₯3
Good Intentions: Adaptive Parameter Servers via Intent Signaling

AdaPS is efficient for many machine learning tasks out of the box because it automatically adapts to the underlying task.

Github: https://github.com/alexrenz/adaps

Paper: https://arxiv.org/abs/2206.00470v1

Docs: https://github.com/alexrenz/AdaPS/blob/vldb20/docs/experiments-vldb20.md

@ArtificialIntelligencedl
πŸ‘6
πŸ” PanopticDepth: A Unified Framework for Depth-aware Panoptic Segmentation

Github: https://github.com/naiyugao/panopticdepth

Paper: https://arxiv.org/abs/2206.00468

Dataset: https://paperswithcode.com/dataset/cityscapes

@ArtificialIntelligencedl
πŸ‘5
β›© XBound-Former: Toward Cross-scale Boundary Modeling in Transformers

Github: https://github.com/jcwang123/xboundformer

Paper: https://arxiv.org/abs/2206.00806v1

Dataset: https://paperswithcode.com/dataset/kvasir-seg

@ArtificialIntelligencedl
πŸ‘6
Forwarded from Python/ django
πŸ”ŠA Python library for audio feature extraction, classification, segmentation and applications.

Code: pyAudioAnalysis

#Python #Audio #Analyzer

@pythonl
πŸ‘5❀1
OntoMerger: An Ontology Integration Library for Deduplicating and Connecting Knowledge Graph Nodes

OntoMerger is an ontology alignment library for deduplicating knowledge graph nodes.

Github: https://github.com/astrazeneca/onto_merger

Paper: https://arxiv.org/abs/2206.02238v1

Documentation: https://ontomerger.readthedocs.io/

@ArtificialIntelligencedl
πŸ‘6
Tutel: Adaptive Mixture-of-Experts at Scale

Tutel is a highly scalable stack design and implementation for MoE with dynamically adaptive parallelism and pipelining.

Github: https://github.com/microsoft/tutel

Examples: https://github.com/microsoft/tutel/blob/main/tutel/examples

Dataset: https://paperswithcode.com/dataset/coco

@ArtificialIntelligencedl
πŸ‘6
πŸ“Œ Sparse Fusion Mixture-of-Experts are Domain Generalizable Learners

Sparse Fusion Mixture-of-Experts (SF-MoE) incorporates sparsity and fusion mechanisms into the MoE framework to keep the model both sparse and predictive.

Github: https://github.com/luodian/sf-moe-dg

Paper: https://arxiv.org/abs/2206.04046v1

Dataset: https://paperswithcode.com/dataset/domainnet

@ArtificialIntelligencedl
πŸ‘4
πŸ”Ή PointNeXt & OpenPoints Library

PointNeXt combines improved training and model-scaling strategies to boost PointNet++ to the state-of-the-art level.

Github: https://github.com/guochengqian/pointnext

Dataset: https://paperswithcode.com/dataset/shapenet

@ArtificialIntelligencedl
πŸ‘5
Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models

Github: https://github.com/google/BIG-bench

Paper: https://arxiv.org/abs/2206.04615v1

Dataset: https://paperswithcode.com/dataset/glue

@ArtificialIntelligencedl
❀8