Hugging Face (Twitter)
RT @LoubnaBenAllal1: Our science team at @huggingface will be doing an AMA on r/LocalLLaMA tomorrow at 8AM PST (5PM CET). The team members behind SmolLM, SmolVLM, FineWeb, and more will be present to answer all your questions!
Hugging Face (Twitter)
RT @Xianbao_QIAN: I'm very glad to see that the new translation model from @TencentHunyuan is now ranking 3rd. It's a reminder that small domain-tuned models are more valuable than they appear.
The agentic stack needs both large and small models. Large models can handle planning and leverage sub-agents built on lean models to perform particular tasks. Small models are cheap, fast, and fine-tunable. They're not the opposite of large models but a complement to them.
Hugging Face (Twitter)
RT @multimodalart: we hacked Wan 2.2 and discovered that it does first- and last-frame filling, works out of the box on 🧨 diffusers
i've built an app for it on @huggingface Spaces (which is powering our nano banana video mode too)
Hugging Face (Twitter)
RT @QGallouedec: sept 4
8-11 am pst
@huggingface science team AMA
reddit r/LocalLlama
Hugging Face (Twitter)
RT @moby763canary21: I'm really glad that people are using my @huggingface model. It's really cool to contribute to Open ML!
#ai #machinelearning #huggingface @ClementDelangue
Hugging Face (Twitter)
RT @lhoestq: "we made uploads to @huggingface using @ApacheSpark much faster than to any other cloud storage"
Spark is faster with Xet on Hugging Face for editing & publishing AI datasets
I explained how it works here:
PS: thumbs up and subscribe!
https://www.youtube.com/watch?v=vmwxVfye8fA&si=hp6Z3a28N0-bmZHF&t=2179
Hugging Face (Twitter)
RT @lvwerra: The Hugging Face research team is doing an AMA on r/LocalLLaMA tomorrow!
Join if you are interested in:
> How did we get into the field? We cover a broad range of backgrounds and paths!
> How can you do impactful things while being more limited in resources than other labs?
> How do we decide which projects to work on when so many things are exciting?
> How does a fully remote team in a high velocity field even work?
> What's the most exciting thing coming in the next few months?
> What's your favourite optimizer and why is it Adam?
> How does Hugging Face make money?
Or whatever else you want to ask - it's an AMA!
Hugging Face (Twitter)
RT @victormustar: Wan 2.2: First frame → Last frame: Upload both as images to get excellent results.
Amazing what open-source AI video can do now
Demo available on Hugging Face
Hugging Face (Twitter)
RT @dylan_ebert_: HunyuanWorld-Voyager - Explorable 3D World Generation
- World-consistent video diffusion
- Long-range world exploration
- Scalable data engine
available on Hugging Face
Hugging Face (Twitter)
RT @LeRobotHF: New arrivals at Hugging Face LeRobot!
We just got two fresh Unitree robots, which means more robots will be added to the library!
Which additions would you like to see in LeRobot?
Hugging Face (Twitter)
RT @natolambert: Pretty big vibe shift coming from a predominantly AI Safety oriented org to say "gatekeeping access to general-purpose technology is not a sustainable or proportionate response to low-confidence evidence of serious risk."
Pretty much my point for a few years.
AI Frontiers (@aif_media)
Precaution Shouldn't Keep Open-Source AI Behind the Frontier
Invoking speculative risks to keep our most capable models behind paywalls could create a new form of digital feudalism.
Ben Brooks β August 31, 2025
This article originally appeared in AI Frontiers…
Hugging Face (Twitter)
RT @RisingSayak: You can now use flash-attention 3 through 🤗 `kernels`, skipping its long build times entirely
Comes with full `torch.compile` support with fullgraph traceability.
Time to melt those hoppers!
Hugging Face (Twitter)
RT @vanstriendaniel: Blogged: Fine-tuning a VLM for art history in hours, not weeks
iconclass-vlm generates museum catalog codes (fun fact: "71H7131" = "Bathsheba with David's letter"!)
@huggingface TRL + Jobs = magic
Guide here:
Daniel van Strien
Fine-tuning VLMs for Art History with TRL and HF Jobs
Train vision-language models to generate Iconclass metadata for artworks using TRL's VLM support and cloud GPUs - no local setup needed!
Hugging Face (Twitter)
RT @thibaudfrere: FineVision is out! A massive open-source dataset by @huggingface for training Vision-Language Models:
- 17.3M images
- 24.3M samples
- 88.9M turns
- 9.5B answer tokens
This is the inaugural article using our new scientific publishing template!
https://huggingface.co/spaces/HuggingFaceM4/FineVision
huggingface.co
FineVision: Open Data is All You Need - a Hugging Face Space by HuggingFaceM4
This application provides an open dataset to train Vision Language Models. Users can access and use this dataset to enhance their models' capabilities by integrating visual and textual data.
Hugging Face (Twitter)
RT @eliebakouch: We're starting the r/LocalLLaMA AMA with a very nice line-up NOW! We also have a nice new vision dataset release that you can ask questions about.
Join us!
Hugging Face (Twitter)
RT @ClementDelangue: Super happy to sign a partnership with @ESCP_bs, the oldest business school in the world, to give all 11,000 students and faculty full access to Hugging Face!
It's particularly meaningful for me: I was a student there 15 years ago, and Dean Laulusa, who was part of my admissions jury, taught me as a professor and recommended me for my first internship. I also started my first startup in the basement of the school with @BlueFactory_ by @MaevaTordo!
AI is going to change education, so I can't wait to see what ESCP students and faculty will build!
Hugging Face (Twitter)
RT @julien_c: Soon, those two will be powered by @LeRobotHF, our open-source OS for bipedal, quadrupedal, or single-arm/dual-arm robots!
Thanks @UnitreeRobotics and @AdinaYakup
Hugging Face (Twitter)
RT @lhoestq: AI engineers don't have to struggle to get datasets ready for training anymore:
1/ Prepare your raw data (from databases/crawls) into AI-ready datasets
2/ Publish on @huggingface so your team can look at the data and train/eval easily
+ uploads are crazy fast with Xet!
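The two-step workflow above can be sketched in plain Python. Step 1 uses only the standard library; the Hub upload in step 2 is shown commented out for illustration, and the repo id "your-username/my-dataset" is a placeholder, not a real repo.

```python
import json

# 1/ Prepare raw records (e.g. rows from a database or a crawl) into an
#    AI-ready JSONL file, one training example per line.
raw_records = [
    {"text": "First document", "source": "crawl"},
    {"text": "Second document", "source": "db"},
]
with open("train.jsonl", "w") as f:
    for record in raw_records:
        f.write(json.dumps(record) + "\n")

# 2/ Publish on the Hugging Face Hub so the team can inspect the data and
#    train/eval against it (uploads go through Xet-backed storage):
# from datasets import load_dataset
# ds = load_dataset("json", data_files="train.jsonl")["train"]
# ds.push_to_hub("your-username/my-dataset")
```

JSONL keeps step 1 tool-agnostic: the same file loads directly into the `datasets` library, pandas, or a plain file reader.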
Hugging Face (Twitter)
RT @moby763canary21: I'm finally having my Twitter moment
Thanks @ClementDelangue and @huggingface
Hugging Face (Twitter)
RT @reach_vb: We just added official support for the OpenAI Codex CLI in the Hugging Face MCP Server - go play with it now!
Hugging Face (Twitter)
RT @AdinaYakup: Latest update from @Kimi_Moonshot
Kimi K2 >>> Kimi K2-Instruct-0905
https://huggingface.co/moonshotai/Kimi-K2-Instruct-0905
- 32B activated / 1T total parameters
- Enhanced agentic coding intelligence
- Better frontend coding experience
- 256K context window for long-horizon tasks
huggingface.co
moonshotai/Kimi-K2-Instruct-0905 Β· Hugging Face
We're on a journey to advance and democratize artificial intelligence through open source and open science.