Hugging Face
Hugging Face (Twitter)

RT @Prince_Canuma: Can I get early access in 2025 @ClementDelangue? 🥲

I wanna build examples for the MLX community. https://twitter.com/ClementDelangue/status/1942960661371216195#m
Hugging Face (Twitter)

RT @abhinadduri: We updated the State Embedding 600M checkpoint on the @ArcInstitute Hugging Face

This model was trained with 4x FLOPs compared to the preprint model. It achieves significantly lower val/loss and does better on internal evals - would recommend using this over the 4-epoch one for single-cell embeddings!

Preprint: https://www.biorxiv.org/content/10.1101/2025.06.26.661135v1

Hugging Face: https://huggingface.co/arcinstitute/SE-600M
Hugging Face (Twitter)

RT @saurabh_ai_news: The GitHub for AI just dropped their first robot. 🤖

Hugging Face (@huggingface) & @pollenrobotics are launching Reachy Mini.

An affordable, hackable, open-source robot for everyone, powered by the community.

This is huge.
Hugging Face (Twitter)

RT @dylan_ebert_: Which generative 3D model produces the best topology?

⚔️ 3D Arena now has Topology-only voting/rankings
Hugging Face (Twitter)

RT @BrianRoemmele: New Open Source Robot!

Meet Reachy.

The Reachy Mini from Hugging Face is an impressive open-source robot for AI and robotics enthusiasts.

Priced at $299, this 11-inch Python-programmable kit, with JavaScript and Scratch support coming soon, is perfect for developers, educators, and hobbyists. Its wide-angle camera, microphones, and 6DOF head movement enable seamless human-robot interaction and AI experimentation.

The Lite version is $299, or the Wireless version is $449 with a Raspberry Pi 5 on board; connect with a dynamic community to code and innovate. With fully open-source hardware, software, and Hugging Face AI model integration, this will be the ultimate testing ground.

Details: hf.co/blog/reachy-mini
Community: https://discord.com/channels/519098054377340948/1377671369893875783
Video: https://youtube.com/watch?v=JvdBJZ-qR18
A must-try for anyone exploring the future of robotics!
Hugging Face (Twitter)

RT @casper_hansen_: Step 2 of many: Last week, I released a biomedical dataset of 521k samples.

This week, I released full-text embeddings (32k context) with 2048 dimensions from the Qwen3 4B embedding model.
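
A minimal sketch of how similar full-text embeddings could be produced with sentence-transformers. The checkpoint name, the Matryoshka-style truncation to 2048 dimensions, and the sample texts below are assumptions for illustration, not details confirmed in the post.

# Hedged sketch: embed long documents with a Qwen3 embedding model (assumed setup).
from sentence_transformers import SentenceTransformer

# Assumed checkpoint; the post does not name the exact repo used.
model = SentenceTransformer(
    "Qwen/Qwen3-Embedding-4B",
    truncate_dim=2048,  # assumed Matryoshka truncation to match the stated 2048 dimensions
)

texts = [
    "Full text of biomedical article one ...",
    "Full text of biomedical article two ...",
]

# Long inputs are truncated to the model's maximum context (up to 32k tokens).
embeddings = model.encode(texts, batch_size=2, normalize_embeddings=True)
print(embeddings.shape)  # (2, 2048)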
Hugging Face (Twitter)

RT @amir_mahla: Deploy full-stack desktop agents in seconds with ScreenEnv! ✨

> Fully Sandboxed Desktop, isolated & reproducible.
> AI-native with MCP support
> Agents can see, click, type, browse, manage apps & files and more
> Runs in Docker, no VMs, no boilerplate

👇 Link in comments

🙏 Huge thank you to my teammate @AymericRoucher for their ideas, collaboration, and incredible energy during this release.
Hugging Face (Twitter)

RT @UFBots: Reachy Mini from @LeRobotHF @huggingface training up for UFB.

Wait till this little bugger gets his arms/legs. He might just be the Ali of UFB 🤖🥊👑
Hugging Face (Twitter)

RT @_fracapuano: Today, we're releasing an open-source async inference stack for all models currently hosted on @huggingface, powering the world's cutest robots, built with love by the team at @LeRobotHF

Details in 🧵
Hugging Face (Twitter)

RT @RisingSayak: Users of `torch.compile`. Some small performance tips:

1. Default to `fullgraph=True` to catch graph breaks as early as possible.

2. Check for recompilation triggers. Put your code under `torch._dynamo.config.patch(error_on_recompile=True)` context.

3. Use regional compilation almost always to cut down cold-start compilation time significantly.

Graph breaks and frequent recompilations can easily get in the way of performance. Eliminate them as much as possible.
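
A minimal sketch of the three tips applied to a toy module; ToyBlock/ToyModel are illustrative names, not code from Diffusers.

import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    def __init__(self, dim=256):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.ff(x)

class ToyModel(nn.Module):
    def __init__(self, dim=256, depth=4):
        super().__init__()
        self.blocks = nn.ModuleList(ToyBlock(dim) for _ in range(depth))

    def forward(self, x):
        for block in self.blocks:
            x = block(x)
        return x

model = ToyModel().eval()
x = torch.randn(8, 256)

# Tip 3: regional compilation -- compile the repeated block rather than the whole
# model, so the compiled code is reused across identical layers and cold start shrinks.
for block in model.blocks:
    block.compile(fullgraph=True)  # Tip 1: fullgraph=True turns graph breaks into hard errors

# Tip 2: fail loudly if anything triggers a recompilation.
with torch._dynamo.config.patch(error_on_recompile=True), torch.no_grad():
    model(x)  # first call compiles each block
    model(x)  # second call must reuse the compiled code, otherwise an error is raised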

In Diffusers, we have a dedicated test suite for checking these things. Reference:
https://github.com/huggingface/diffusers/blob/941b7fc0843139e52419a65b7fa850169fde0360/tests/models/test_modeling_common.py#L1952

Immense thanks to @anijain2305 for always helping out!
Hugging Face (Twitter)

RT @Xianbao_QIAN: Skywork-R1V 3.0: an open-source model that beats closed-source models on multi-modal reasoning.

Link on @huggingface
https://huggingface.co/Skywork/Skywork-R1V3-38B
Hugging Face (Twitter)

RT @apples_jimmy: I think I'm more excited for the OpenAI open-source model than GPT-5
Hugging Face (Twitter)

RT @Xianbao_QIAN: Kimi K2 is open sourced on @huggingface

- 1T MoE, 32B active params
- Excellent at coding, tool use & math
- Not a thinking model

- Both Base and Instruct are released, friendly for fine-tuning!!!

https://huggingface.co/moonshotai/Kimi-K2-Base https://twitter.com/Xianbao_QIAN/status/1943621126652821617#m
Hugging Face (Twitter)

RT @ClementDelangue: 1T parameters, open-weights, just released on @huggingface!