Hugging Face
Hugging Face (Twitter)

RT @jandotai: Hugging Face 🤝 Jan

You can now use Hugging Face as a remote model provider in Jan.

Go to Settings -> Model Providers -> add your Hugging Face API key. Then open a new chat and pick a model from @huggingface.

Works with any model on Hugging Face, right inside Jan.
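For context (this is not Jan-specific code), the same Hugging Face API key also drives remote inference directly from Python via huggingface_hub; a minimal sketch, with an illustrative model id and a placeholder token:

```python
from huggingface_hub import InferenceClient

# The key you enter in Jan's settings works here too; InferenceClient also picks up
# the HF_TOKEN environment variable if api_key is omitted.
client = InferenceClient(api_key="hf_...")  # placeholder token

response = client.chat_completion(
    model="meta-llama/Llama-3.1-8B-Instruct",  # illustrative model id
    messages=[{"role": "user", "content": "Hello from a remote Hugging Face model!"}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```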
Hugging Face (Twitter)

RT @abidlabs: New Gradio component: 🥳 gr.Dialogue:

• As an output, it can be used to show diarized speech transcription
• As input, it's perfect for multispeaker TTS models, as it also supports auto-complete tags 🪄

Try it out in Gradio 5.40!
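A minimal sketch of wiring gr.Dialogue as an output component; the turn schema ({"speaker": ..., "text": ...}) and the stubbed transcription function are assumptions for illustration, not the component's documented API:

```python
import gradio as gr

def transcribe(audio_path):
    # Stub standing in for a real ASR + speaker-diarization pipeline;
    # returns speaker-tagged turns (schema assumed for illustration).
    return [
        {"speaker": "Speaker 1", "text": "Thanks for joining the call."},
        {"speaker": "Speaker 2", "text": "Happy to be here."},
    ]

demo = gr.Interface(
    fn=transcribe,
    inputs=gr.Audio(type="filepath"),
    outputs=gr.Dialogue(label="Diarized transcript"),  # requires Gradio 5.40+
)

if __name__ == "__main__":
    demo.launch()
```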
Hugging Face (Twitter)

RT @jackvial89: I've created a @LeRobotHF @huggingface dataset for the screwdriver robot. This dataset contains 391 human demonstrations of attaching a part with a screw in 3 positions: left, right, center. Currently training a few different models on this dataset!
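For reference, a LeRobot dataset on the Hub loads in a few lines; the repo id below is a placeholder, since the tweet doesn't name the dataset:

```python
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

# Placeholder repo id -- substitute the actual screwdriver dataset name.
dataset = LeRobotDataset("jackvial/screwdriver-demos")

print(f"{dataset.num_episodes} episodes, {len(dataset)} frames")
sample = dataset[0]  # dict of camera frames, robot state, and action tensors
print(sample.keys())
```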
Hugging Face (Twitter)

RT @RisingSayak: Wait is over 🤯

An Apache 2.0 DiT-based image generation model from @Alibaba_Qwen -- Qwen-Image 🔥

Supported in Diffusers. Training script PR is up and should be merged soon.

Go, fire!
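A minimal sketch of running Qwen-Image through Diffusers; the dtype, step count, and prompt are illustrative choices, not tested settings:

```python
import torch
from diffusers import DiffusionPipeline

# DiffusionPipeline resolves the model-specific pipeline class from the Hub repo.
pipe = DiffusionPipeline.from_pretrained("Qwen/Qwen-Image", torch_dtype=torch.bfloat16)
pipe.to("cuda")

image = pipe(
    prompt="A cozy coffee shop window with 'Qwen-Image' painted on the glass",
    num_inference_steps=50,  # illustrative default
).images[0]
image.save("qwen_image.png")
```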
Hugging Face (Twitter)

RT @romainhuet: A great day to be a developer! Stay tuned! 🤗
Hugging Face (Twitter)

RT @_lewtun: One line of code is all it takes to fine-tune the gpt-oss models from @OpenAI 🔥

> Support for targeting the MoE expert layers with PEFT
> Kernels for FlashAttention3 & MegaBlocks
> Fast inference with MXFP4 quantization format

In our testing, these models are extremely efficient to tune and can be adapted to new domains with just a few hundred samples 🤯

Download the models: huggingface.co/openai
Training & inference recipes: https://github.com/huggingface/gpt-oss-recipes/tree/main
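The recipes repo has the full details; below is only a hedged sketch of what a LoRA run with TRL's SFTTrainer looks like -- the dataset and hyperparameters are illustrative, not the recipe itself:

```python
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Illustrative dataset; swap in your own chat-formatted data.
dataset = load_dataset("HuggingFaceH4/Multilingual-Thinking", split="train")

trainer = SFTTrainer(
    model="openai/gpt-oss-20b",  # the 20B variant; the 120B needs more GPUs
    train_dataset=dataset,
    args=SFTConfig(output_dir="gpt-oss-20b-sft", per_device_train_batch_size=1),
    peft_config=LoraConfig(r=8, lora_alpha=16, target_modules="all-linear"),
)
trainer.train()
```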
Hugging Face (Twitter)

RT @mervenoyann: gpt-oss @OpenAI is here! 🔥

> two MoEs with 21B/3.6B and 117B/5.1B total/active params, efficient reasoning models 🤯
> use & fine-tune with transformers & TRL 🛠️
> inference powered by @huggingface Inference Providers 🫡
> apache 2.0 license 💗
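A minimal inference sketch with transformers; the chat message and generation settings are illustrative:

```python
from transformers import pipeline

# device_map/dtype settings here are illustrative and assume a recent transformers release.
generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in two sentences."}]
output = generator(messages, max_new_tokens=200)
print(output[0]["generated_text"][-1]["content"])
```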
Hugging Face (Twitter)

RT @multimodalart: the gpt-oss model is really easy to tune!

get started with customizing/fine-tuning to make gpt-oss your own with the @OpenAI + @huggingface cookbook 🤝

https://cookbook.openai.com/articles/gpt-oss/fine-tune-transfomers
Hugging Face (Twitter)

RT @reach_vb: OpenAI COOKED! That's an Apache 2.0 licensed 120B model competing with OpenAI o3 🤯

> 120B and 20B models
> 128K context
> First open model able to call tools within its CoT
> Released with optimised kernels

Apache 2.0 license! What a landmark release - Kudos @OpenAIDevs 🤗
Hugging Face (Twitter)

RT @reach_vb: The best open model currently available on Inference Providers, blazing fast! Powered by @CerebrasSystems 🔥

Try it out today! https://twitter.com/reach_vb/status/1952782804023988557#m
Hugging Face (Twitter)

RT @ClementDelangue: When @sama told me at the AI summit in Paris that they were serious about releasing open-source models & asked what would be useful, I couldn’t believe it.

But six months of collaboration later, here it is: welcome to gpt-oss on @huggingface! It comes in two sizes: one for maximum reasoning capability and a cheaper, faster on-device option, both Apache 2.0. It’s integrated with our inference partners that power the official demo.

This open-source release is critically important & timely, because as @WhiteHouse emphasized in the US Action plan, we need stronger American open-source AI foundations. And who could do that better than the very startup that has been pioneering and leading the field in so many ways.

Feels like a plot twist.
Feels like a comeback.
Feels like the beginning of something big, let’s go open-source AI 🔥🔥🔥
Hugging Face (Twitter)

RT @romainhuet: Today’s a big day! We have something really exciting to share with the open-source community.

We’re launching two open-weight language models: gpt-oss-120b and gpt-oss-20b.

They’re incredible models, built for developers, trained for reasoning, efficiency, and real-world use. 🧵
Hugging Face (Twitter)

RT @dylan_ebert_: OpenAI just released GPT-OSS: An Open Source Language Model on Hugging Face

Open source meaning:
💸 Free
🔒 Private
🔧 Customizable
Hugging Face (Twitter)

RT @romainhuet: We built a gpt-oss developer playground so you can try the models right away:

• Choose your model and set the reasoning effort 🎛️
• See the model’s raw chain-of-thought for debugging and research 🧠
• Get a handful of free messages, and sign in with @huggingface for more 🤗
Hugging Face (Twitter)

RT @OpenAI: Both gpt-oss models are free to download on Hugging Face, with native MXFP4 quantization built in for efficient deployment.

Full list of day-one support is available on our blog.

https://huggingface.co/collections/openai/gpt-oss-68911959590a1634ba11c7a4
Hugging Face (Twitter)

RT @romainhuet: Partnering with @huggingface has been incredible.

We also worked with @ollama, @lmstudio, @vllm_project, so you can run models locally with your favorite tool on day one, and with many cloud and hardware partners so you can deploy it efficiently anywhere.
openai.com/open-models/
Hugging Face (Twitter)

RT @ClementDelangue: And just like that, @OpenAI gpt-oss is now the number one trending model on @huggingface, out of almost 2M open models 🚀

People sometimes forget that they've already transformed the field: GPT-2, released back in 2019, is HF's most downloaded text-generation model ever, and Whisper has consistently ranked in the top 5 audio models.

Now that they are doubling down on openness, they may completely transform the AI ecosystem, again. Exciting times ahead!