Hugging Face
Hugging Face (Twitter)

RT @MaziyarPanahi: Perfect Sunday: I just used Kimi-K2 by @Kimi_Moonshot to vibe code a @Gradio app! 🔥

You can use the "Anycoder" Space by @_akhaliq, hosted on @huggingface, for free. It was super quick! 🤗

PS: I am aware that I used Gradio to vibe code another Gradio app! Pun very much intended here! 😂
Hugging Face (Twitter)

RT @AdinaYakup: From paper to project page in one click 🚀

AnyCoder 🔥 turns research PDFs into structured, shareable project pages in seconds!
https://huggingface.co/spaces/akhaliq/anycoder

Powered by 8 SoTA open models on @huggingface
Hugging Face (Twitter)

RT @vitrupo: Jack Dorsey says AI must be permissionless because constraint kills innovation.

Five CEOs shouldn't dictate what brings humanity forward.

Open source is the answer.

To protect ourselves, we have to race ahead, eliminating single points of failure before they become civilization's choke points.
Hugging Face (Twitter)

RT @yagilb: I'm not sure how HF is paying for all those TBs going in and out, but at least now we're chipping in a little bit. Thanks @huggingface for being the great library of AI models for us all 🙏
Hugging Face (Twitter)

RT @vllm_project: The @huggingface Transformers ↔️ @vllm_project integration just leveled up: Vision-Language Models are now supported out of the box!

If the model is integrated into Transformers, you can now run it directly with vLLM.

https://github.com/vllm-project/vllm/pull/20543

Great work @RTurganbay 👏
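For context, a minimal sketch of what the integration enables, assuming a recent vLLM release with the Transformers backend; the model ID, image URL, and prompt are illustrative placeholders, not taken from the tweet or the PR:

```python
# Illustrative sketch: running a Transformers-integrated vision-language model
# directly through vLLM's offline API. Model ID and inputs are placeholders.
from vllm import LLM, SamplingParams

# If the architecture has no native vLLM implementation, recent vLLM versions
# can fall back to the Transformers modeling code (model_impl="transformers").
llm = LLM(model="Qwen/Qwen2.5-VL-3B-Instruct", max_model_len=8192)

# vLLM's chat API accepts OpenAI-style multimodal messages.
messages = [{
    "role": "user",
    "content": [
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
        {"type": "text", "text": "Describe this image in one sentence."},
    ],
}]

outputs = llm.chat(messages, sampling_params=SamplingParams(max_tokens=128))
print(outputs[0].outputs[0].text)
```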
Hugging Face (Twitter)

RT @itsPaulAi: Wait so Alibaba Qwen has just released ANOTHER model??

Qwen3-Coder is simply one of the best coding models we've ever seen.

→ Still 100% open source
→ Up to 1M context window 🔥
→ 35B active parameters
→ Same performance as Sonnet 4

They're releasing a CLI tool as well ↓
Hugging Face (Twitter)

RT @AdinaYakup: Qwen3-Coder 💻 agentic code model by @Alibaba_Qwen

https://huggingface.co/collections/Qwen/qwen3-coder-687fc861e53c939e52d52d10

✨ 480B total, 35B activated MoE
✨ Agentic Coding + Browser Use → Top code model performance
✨ 256K context (up to 1M via YaRN) for repo-scale understanding
Hugging Face (Twitter)

RT @cline: Live in Cline: Qwen3-235B-A22B-2507

Open source and with a 262k context window, the latest from Qwen is impressive on the benchmarks, but we're looking forward to its real-world performance.

fyi, here's the model name in Cline:

qwen/qwen3-235b-a22b-07-25 https://twitter.com/Alibaba_Qwen/status/1947344511988076547#m
Hugging Face (Twitter)

RT @deedydas: The best open-source AI model just dropped a detailed report on how it was trained, a rare resource for students, given that no frontier lab is publishing this kind of detail!

Kimi K2's estimated total training cost is ~$20-30M, roughly in line with its pricing: $0.6/M input tokens, $2.5/M output tokens.

10 highlights:
Hugging Face (Twitter)

RT @Alibaba_Qwen: >>> Qwen3-Coder is here! ✅

We're releasing Qwen3-Coder-480B-A35B-Instruct, our most powerful open agentic code model to date. This 480B-parameter Mixture-of-Experts model (35B active) natively supports 256K context and scales to 1M context with extrapolation. It achieves top-tier performance across multiple agentic coding benchmarks among open models, including SWE-bench-Verified!!! 🚀

Alongside the model, we're also open-sourcing a command-line tool for agentic coding: Qwen Code. Forked from Gemini Code, it includes custom prompts and function call protocols to fully unlock Qwen3-Coder's capabilities. Qwen3-Coder works seamlessly with the community's best developer tools. As a foundation model, we hope it can be used anywhere across the digital world - Agentic Coding in the World!

💬 Chat: chat.qwen.ai/
📚 Blog: https://qwenlm.github.io/blog/qwen3-coder/
🤗 Model: https://hf.co/Qwen/Qwen3-Coder-480B-A35B-Instruct
🤖 Qwen Code: github.com/QwenLM/qwen-code
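As a reference, here is a minimal sketch of prompting the released checkpoint with the standard Transformers chat-template flow; the prompt is made up, and in practice a 480B MoE would be served with an inference engine such as vLLM across many GPUs, so treat this as illustrative rather than a deployment recipe:

```python
# Illustrative only: standard Transformers chat-template usage with the
# Qwen3-Coder-480B-A35B-Instruct checkpoint linked above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-Coder-480B-A35B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Write a Python function that merges two sorted lists."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```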
Hugging Face (Twitter)

RT @itsPaulAi: Alibaba Qwen has just released a non-thinking model even more powerful than Kimi K2...

And even better than Claude Opus 4 🤯

→ 100% open source
→ Only 22B active parameters
→ Available for free in Qwen Chat

All the links below
Hugging Face (Twitter)

RT @reach_vb: Qwen on a rampage 🔥 - Qwen3 Coder 480B (35B Active), beats Kimi K2 AND Claude Sonnet 4 - Apache 2.0 licensed!

> Up to 1M context
> Non-Reasoning
> Supports agentic and browser mode

So so excited for the smaller models in the series 🤗
Hugging Face (Twitter)

RT @reach_vb: NEW: Higgs Audio V2 from @boson_ai: open, unified TTS model w/ voice cloning, beats GPT-4o mini TTS and ElevenLabs v2 🔥

> Trained on 10M hours (speech, music, events)
> Built on top of Llama 3.2 3B
> Works real-time and on edge
> Beats GPT-4o-mini-tts, ElevenLabs v2 in prosody & emotion
> Multi-speaker dialog
> Zero-shot voice cloning 🤩

> Available on Hugging Face

Kudos to the folks at Boson AI for releasing such brilliant work and all the details around the model! 🤗