Hugging Face (Twitter)
RT @amir_mahla: Deploy full-stack desktop agents in seconds with ScreenEnv! ✨
> Fully Sandboxed Desktop, isolated & reproducible.
> AI-native with MCP support
> Agents can see, click, type, browse, manage apps & files and more
> Runs in Docker, no VMs, no boilerplate
👇 Link in comments
🙏 Huge thank you to my teammate @AymericRoucher for their ideas, collaboration, and incredible energy during this release.
Hugging Face (Twitter)
RT @UFBots: Reachy Mini from @LeRobotHF @huggingface training up for UFB.
Wait till this little bugger gets his arms/legs. He might just be the Ali of UFB 🤖🥊👑
Hugging Face (Twitter)
RT @_fracapuano: Today, we're releasing an open-source async inference stack for all models currently hosted on @huggingface, powering the world's cutest robots, built with love by the team at @LeRobotHF
Details in 🧵
Hugging Face (Twitter)
RT @RisingSayak: Users of `torch.compile`. Some small performance tips:
1. Default to `fullgraph=True` to catch graph breaks as early as possible.
2. Check for recompilation triggers. Put your code under `torch._dynamo.config.patch(error_on_recompile=True)` context.
3. Use regional compilation almost always to cut down cold-start timing significantly.
Graph breaks and frequent recompilations can easily get in the way of performance. Eliminate them as much as possible.
In Diffusers, we have a dedicated test suite for checking these things. Reference:
https://github.com/huggingface/diffusers/blob/941b7fc0843139e52419a65b7fa850169fde0360/tests/models/test_modeling_common.py#L1952
Immense thanks to @anijain2305 for always helping out!
GitHub
diffusers/tests/models/test_modeling_common.py at 941b7fc0843139e52419a65b7fa850169fde0360 · huggingface/diffusers
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX. - huggingface/diffusers
Hugging Face (Twitter)
RT @Xianbao_QIAN: Skywork-R1V 3.0: an open-source model that beats closed-source models on multi-modal reasoning.
Link on @huggingface
https://huggingface.co/Skywork/Skywork-R1V3-38B
Hugging Face (Twitter)
RT @apples_jimmy: I think I’m more excited for the OpenAI open-source model than GPT-5
Hugging Face (Twitter)
RT @Xianbao_QIAN: Kimi K2 is open-sourced on @huggingface
- 1T MoE, 32B active params
- Excellent coding, tool use & math
- Not a thinking model
- Both BASE and Instruct are released, friendly for fine-tunes!!!
https://huggingface.co/moonshotai/Kimi-K2-Base https://twitter.com/Xianbao_QIAN/status/1943621126652821617#m
Hugging Face (Twitter)
RT @ClementDelangue: 1T parameters, open-weights, just released on @huggingface!
Hugging Face (Twitter)
RT @reach_vb: Pretty wild that @Kimi_Moonshot dropped a 1T parameter (32B active) MoE trained on 15.5 Trillion tokens - MIT licensed 🔥
Beats all other open weights models across coding, agentic and reasoning benchmarks
Of course, live on Hugging Face! 🤗
Hugging Face (Twitter)
RT @rohanpaul_ai: 🇨🇳 INCREDIBLE. China just released a 1T-param top open-source model for coding and agentic tool work.
Kimi K2 from Moonshot AI
Insane numbers on benchmarks.
On LiveCodeBench the model hits 53.7 Pass@1, beating DeepSeek‑V3 by almost 7 points and clearing Qwen‑235B by more than 16 points
Scores 65.8% on single‑shot SWE‑bench agentic coding and 70.6 on Tau2 retail tool use, numbers that sit at or near the top of the open stack.
- 1T total parameters MoE, 32B active
- Trained with the Muon optimizer
- Very strong across frontier knowledge, reasoning, and coding tasks
- SOTA on SWE Bench Verified, Tau2 & AceBench among open models
- Pre-trained on 15.5T tokens with zero training instability.
- Agentic Intelligence: Specifically designed for tool use, reasoning, and autonomous problem-solving.
- API endpoints mirror OpenAI and Anthropic schemas, while self‑hosters can load weights through vLLM, SGLang, KTransformers, or...
Hugging Face (Twitter)
RT @AIatAMD: 🚀 We’re excited to partner with @HuggingFace to launch a new section of their MCP Course: Local Tiny Agents with AMD NPU and iGPU Acceleration — powered by Lemonade Server 🍋 https://github.com/lemonade-sdk/lemonade
In this hands-on module, you’ll learn how to:
✅ Accelerate end-to-end Tiny Agents applications using AMD’s Neural Processing Unit (NPU) and integrated GPU (iGPU)
✅ Enable local file access and build assistants that handle sensitive data entirely on-device — ensuring maximum privacy and performance
We’re proud to support developers building smarter, faster, and more private AI agents.
🔗 Dive into the course: https://huggingface.co/learn/mcp-course/unit2/lemonade-server
⭐ Star our Lemonade GitHub repo: https://github.com/lemonade-sdk/lemonade
#AMD #HuggingFace #TinyAgents #EdgeAI #NPUs #iGPU #LemonadeServer #MCP #AIAcceleration
Hugging Face (Twitter)
RT @Kimi_Moonshot: 🚀 Hello, Kimi K2! Open-Source Agentic Model!
🔹 1T total / 32B active MoE model
🔹 SOTA on SWE Bench Verified, Tau2 & AceBench among open models
🔹 Strong in coding and agentic tasks
🐤 Multimodal & thought-mode not supported for now
With Kimi K2, advanced agentic intelligence is more open and accessible than ever. We can't wait to see what you build!
🔌 API is here: platform.moonshot.ai
- $0.15 / million input tokens (cache hit)
- $0.60 / million input tokens (cache miss)
- $2.50 / million output tokens
🔗 Tech blog: https://moonshotai.github.io/Kimi-K2/
🔗 Weights & code: huggingface.co/moonshotai
🔗 Github: https://github.com/MoonshotAI/Kimi-K2
Try it now at Kimi.ai or via API!
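A quick sanity check of what those rates mean per request, using the prices from the announcement above (the token counts are hypothetical):

```python
# Kimi K2 API pricing in USD per million tokens, per the announcement.
PRICE_INPUT_CACHE_HIT = 0.15
PRICE_INPUT_CACHE_MISS = 0.60
PRICE_OUTPUT = 2.50

def request_cost(cached_in: int, fresh_in: int, out_tokens: int) -> float:
    """Cost in USD for a single request, given token counts."""
    return (
        cached_in * PRICE_INPUT_CACHE_HIT
        + fresh_in * PRICE_INPUT_CACHE_MISS
        + out_tokens * PRICE_OUTPUT
    ) / 1_000_000

# Example: 100k cached input tokens, 20k fresh input, 5k output.
cost = request_cost(100_000, 20_000, 5_000)
print(f"${cost:.4f}")  # → $0.0395
```

Notice how the cache-hit rate dominates the economics: the same 100k input tokens cost 4x more on a cache miss than on a hit.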
Hugging Face (Twitter)
RT @AndrewCurran_: Rumblings all morning it was going to arrive, and here it is. Open source, and comparable to the best models in the world. https://twitter.com/Kimi_Moonshot/status/1943687594560332025#m
Hugging Face (Twitter)
RT @ClementDelangue: Kimi K2 has just been deployed and you can try its 1T parameters on the Hugging Face model page already thanks to @novita_labs!
Hugging Face (Twitter)
RT @ClementDelangue: In the meantime, you can follow hf.co/openai to be first to know when it’s out. Just crossed 10k followers today. https://twitter.com/sama/status/1943837550369812814#m
Hugging Face (Twitter)
RT @ClementDelangue: We're open-sourcing "The Amazing Hand", an eight-degree-of-freedom humanoid robot hand compatible with @lerobot that can be 3D-printed at home for less than $250 ✌️✌️✌️
Given the success of Reachy Mini (2,000+ robots sold in a few days), we won't have the bandwidth to manufacture this one ourselves but we release the bill of materials, the CAD files and assembly guides for everyone to build or sell their own, let's go open-source AI robotics!
Hugging Face (Twitter)
RT @ThuleanFuturist: I can’t believe nobody is talking about the Kimi K2 model
It’s unbelievable: an open-source, 1T-parameter model. This should be the biggest story in tech right now
Hugging Face (Twitter)
RT @cedric_chee: Holy shit! Kimi K2 one-shotted Minecraft for the web that took me 4 days and 6 attempts using Gemini 2.5 Pro.
Hugging Face (Twitter)
RT @LiquidAI_: Try LFM2 with llama.cpp today!
We released today a collection of GGUF checkpoints for developers to run LFM2 everywhere with llama.cpp
Select the most relevant precision for your use case and start building today.
https://huggingface.co/LiquidAI/LFM2-1.2B-GGUF