Hugging Face (Twitter)
RT @OpenAI: Both gpt-oss models are free to download on Hugging Face, with native MXFP4 quantization built in for efficient deployment.
Full list of day-one support is available on our blog.
https://huggingface.co/collections/openai/gpt-oss-68911959590a1634ba11c7a4
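A minimal sketch of trying the smaller model locally with the transformers text-generation pipeline, assuming a transformers release with gpt-oss support and enough GPU memory; the model id is from the collection above, while the prompt and generation settings are illustrative:

from transformers import pipeline

# Sketch: run openai/gpt-oss-20b through the standard text-generation pipeline.
generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # model id from the collection linked above
    torch_dtype="auto",          # let transformers pick the dtype / quantized kernels
    device_map="auto",           # place weights across available GPUs
)

messages = [{"role": "user", "content": "Explain MXFP4 quantization in two sentences."}]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])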
Hugging Face (Twitter)
RT @romainhuet: Partnering with @huggingface has been incredible.
We also worked with @ollama, @lmstudio, @vllm_project, so you can run models locally with your favorite tool on day one, and with many cloud and hardware partners so you can deploy it efficiently anywhere.
openai.com/open-models/
OpenAI
Open models by OpenAI
Advanced open-weight reasoning models to customize for any use case and run anywhere.
Hugging Face (Twitter)
RT @ClementDelangue: And just like that, @OpenAI gpt-oss is now the number one trending model on @huggingface, out of almost 2M open models 🚀
People sometimes forget that they've already transformed the field: GPT-2, released back in 2019 is HF's most downloaded text-generation model ever, and Whisper has consistently ranked in the top 5 audio models.
Now that they are doubling down on openness, they may completely transform the AI ecosystem, again. Exciting times ahead!
Hugging Face (Twitter)
RT @OpenAIDevs: Student credits for gpt-oss
With @huggingface, we’re offering 500 students $50 in inference credits to explore gpt-oss.
We hope these open models can help unlock new opportunities in class projects, research, fine-tuning, and more: tally.so/r/mKKdXX
Tally Forms
Open Models for Students & Research
Hugging Face (Twitter)
RT @nation_grok: NEWS: Live in Cline via @huggingface: gpt-oss-120b & gpt-oss-20b
> gpt-oss-120b is built for production-grade applications with high reasoning capabilities
> gpt-oss-20b is for lower latency needs and specialized local use cases
Also accessible via the Cline & OpenRouter providers
Hugging Face (Twitter)
RT @vectro: Quantized gpt-oss-20b on @huggingface
Made to run locally on your computer
Hugging Face (Twitter)
RT @vanstriendaniel: You can now generate synthetic data using @OpenAI's GPT OSS models on @huggingface Jobs!
One command, no setup:
hf jobs uv run --flavor l4x4 [script-url] \
--input-dataset your/dataset \
--output-dataset your/output
Works on L4 GPUs ⚡
https://huggingface.co/datasets/uv-scripts/openai-oss
huggingface.co
uv-scripts/openai-oss · Datasets at Hugging Face
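The command above runs the ready-made uv script on Hugging Face Jobs; a rough local sketch of the same idea, where "your/dataset", "your/output", and the "prompt" column are placeholders and the uv-scripts/openai-oss script remains the actual reference:

from datasets import load_dataset
from transformers import pipeline

# Rough local sketch of the synthetic-data idea: read prompts from an input
# dataset, generate completions with gpt-oss, and keep them as a new column.
ds = load_dataset("your/dataset", split="train")
generator = pipeline("text-generation", model="openai/gpt-oss-20b",
                     torch_dtype="auto", device_map="auto")

def add_response(example):
    messages = [{"role": "user", "content": example["prompt"]}]
    out = generator(messages, max_new_tokens=256)
    example["response"] = out[0]["generated_text"][-1]["content"]
    return example

ds = ds.map(add_response)
ds.push_to_hub("your/output")  # requires a Hugging Face token / prior login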
Hugging Face (Twitter)
RT @ArtificialAnlys: Link to question dataset on @huggingface:
https://huggingface.co/datasets/ArtificialAnalysis/AA-LCR
huggingface.co
ArtificialAnalysis/AA-LCR · Datasets at Hugging Face
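A minimal sketch for pulling the question dataset with the datasets library, assuming the default configuration; the dataset card is authoritative for configs, splits, and fields:

from datasets import load_dataset

# Load the Artificial Analysis question dataset linked above and show its splits.
ds = load_dataset("ArtificialAnalysis/AA-LCR")
print(ds)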
Hugging Face (Twitter)
RT @HaihaoShen: 🫡 Probably the first INT4 GPT-OSS model (see link below); it should be friendly to most existing hardware. A 120B INT4 model is coming next.
#intel #huggingface #openai #gpt #autoround
huggingface.co
Intel/gpt-oss-20b-int4-rtn-AutoRound · Hugging Face
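A rough loading sketch, assuming the quantization backend named on the model card (e.g. the auto-round package) is installed and that the checkpoint goes through the standard transformers path; the model card's own snippet takes precedence:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Sketch: load Intel's INT4 (round-to-nearest) AutoRound build of gpt-oss-20b.
# Assumes the backend required by the model card is installed.
model_id = "Intel/gpt-oss-20b-int4-rtn-AutoRound"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Say hello in one sentence."}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)
print(tokenizer.decode(model.generate(inputs, max_new_tokens=64)[0], skip_special_tokens=True))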
Hugging Face (Twitter)
RT @MaziyarPanahi: 🚀 In under 10 hours, both gpt-oss models from @OpenAI are now trending at #1 and #2 on @huggingface!
Openness wins again.
@sama, can you feel the love tonight? 🎶
Hugging Face (Twitter)
RT @wjb_mattingly: Woot! First finetune of Dots.OCR on @huggingface ! Haven't done this since Qwen 2 VL. I'll be sharing the finetuning script tomorrow.
Hugging Face (Twitter)
RT @cline: Live in Cline via @huggingface: gpt-oss-120b & gpt-oss-20b
> gpt-oss-120b is built for production-grade applications with high reasoning capabilities
> gpt-oss-20b is for lower latency needs and specialized local use cases
Also accessible via the Cline & OpenRouter providers
Hugging Face (Twitter)
RT @reach_vb: Want to run the latest @OpenAI gpt-oss models with Continuous Batching, Tensor Parallelism, Flash Attention 3 and more?
Check out our detailed Inference and Fine-tuning recipes 🤗
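As one illustrative route to continuous batching and tensor parallelism (a vLLM sketch, not the recipes themselves), assuming a vLLM build with gpt-oss support and two visible GPUs:

from vllm import LLM, SamplingParams

# Sketch: run gpt-oss-20b under vLLM, which batches requests continuously by
# default; tensor_parallel_size=2 assumes two GPUs. Whether Flash Attention 3
# kernels are used depends on the vLLM build and the hardware.
llm = LLM(model="openai/gpt-oss-20b", tensor_parallel_size=2)
params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Summarize the harmony response format in one paragraph."], params)
print(outputs[0].outputs[0].text)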
Hugging Face (Twitter)
RT @ClementDelangue: Lots of conflicting takes about gpt-oss (yay open-source in the spotlight)!
We’re powering the official @openai demo gpt-oss.com with HF inference providers thanks to @FireworksAI_HQ, @CerebrasSystems, @GroqInc and @togethercompute so we have a front-row seat of what’s happening.
Something to remember: inference for new frontier open models isn’t easy, especially with a new format like harmony and the volume of interest that gpt-oss is getting out of the gate.
Early spikes can temporarily affect quality, accuracy, and overall "vibes," particularly just 24 hours post-release when providers are racing against the clock with barely any sleep!
Some advice to avoid forming the wrong opinions:
- If you care about getting vibes as fast as possible, use a hosted setup that gives you a diversity of providers like HF inference providers (https://huggingface.co/docs/inference-providers/guides/gpt-oss) and follow official standard...
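A minimal sketch of following that advice through Inference Providers with huggingface_hub, assuming a recent huggingface_hub release and an HF_TOKEN in the environment; provider selection follows the linked guide:

import os
from huggingface_hub import InferenceClient

# Sketch: query gpt-oss-120b through Hugging Face Inference Providers. Routing
# picks a provider by default, or one can be pinned via the provider argument
# as described in the guide linked above.
client = InferenceClient(token=os.environ["HF_TOKEN"])
response = client.chat.completions.create(
    model="openai/gpt-oss-120b",
    messages=[{"role": "user", "content": "Suggest three prompts for vibe-checking a new model."}],
    max_tokens=256,
)
print(response.choices[0].message.content)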
Hugging Face (Twitter)
RT @dylan_ebert_: Hugging Face Explained in 45 seconds https://twitter.com/hamiltonsucks76/status/1952813334102983029#m
Hugging Face (Twitter)
RT @elonmusk: @BasedBeffJezos: It’s high time we open sourced Grok 2. Will make it happen next week.
We’ve just been fighting fires and burning the 4am oil nonstop for a while now.
Hugging Face (Twitter)
RT @romainhuet: Great perspective and tips from @ClementDelangue on the early days of gpt-oss.
Building strong foundations takes time, and we’re excited to be working with inference providers and @huggingface to help these models perform at their best. Please keep your feedback coming! https://twitter.com/ClementDelangue/status/1953119901649891367#m
Hugging Face (Twitter)
RT @romainhuet: Both gpt-oss models are trending #1 and #2 among 2M models on @huggingface! 🤗 Thanks to the open-source AI community for your support since launch.
We’re following discussions and will pop in when we can—feel free to ask questions, share ideas, and show what you’re building!