Hugging Face (Twitter)
RT @reach_vb: The best open model currently available on Inference Providers, blazing fast! Powered by @CerebrasSystems 🔥
Try it out today! https://twitter.com/reach_vb/status/1952782804023988557#m
Hugging Face (Twitter)
RT @ClementDelangue: When @sama told me at the AI summit in Paris that they were serious about releasing open-source models & asked what would be useful, I couldn’t believe it.
But six months of collaboration later, here it is: welcome gpt-oss to @huggingface! It comes in two sizes: one for maximum reasoning capability and a cheaper, faster on-device option, all Apache 2.0. It’s integrated with our inference partners, which power the official demo.
This open-source release is critically important & timely because, as the @WhiteHouse emphasized in the US AI Action Plan, we need stronger American open-source AI foundations. And who could do that better than the very startup that has been pioneering and leading the field in so many ways?
Feels like a plot twist.
Feels like a comeback.
Feels like the beginning of something big, let’s go open-source AI 🔥🔥🔥
Hugging Face (Twitter)
RT @romainhuet: Today’s a big day! We have something really exciting to share with the open-source community.
We’re launching two open-weight language models: gpt-oss-120b and gpt-oss-20b.
They’re incredible models, built for developers, trained for reasoning, efficiency, and real-world use.🧵
Hugging Face (Twitter)
RT @dylan_ebert_: OpenAI just released GPT-OSS: An Open Source Language Model on Hugging Face
Open source meaning:
💸 Free
🔒 Private
🔧 Customizable
Hugging Face (Twitter)
RT @romainhuet: We built a gpt-oss developer playground so you can try the models right away:
• Choose your model and set the reasoning effort 🎛️
• See the model’s raw chain-of-thought for debugging and research 🧠
• Get a handful of free messages, and sign in with @huggingface for more 🤗
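The playground's knobs map onto an ordinary OpenAI-compatible chat-completions request. Below is a minimal sketch of what such a payload might look like; the `reasoning_effort` field name is an assumption based on common OpenAI-compatible servers, so check your provider's docs before relying on it.

```python
# Sketch: assembling a gpt-oss chat-completions payload with a
# reasoning-effort knob. The "reasoning_effort" key is an ASSUMED
# parameter name; verify it against your provider's documentation.

def build_request(model: str, prompt: str, effort: str = "medium") -> dict:
    """Build a chat-completions payload for an OpenAI-compatible endpoint."""
    assert effort in {"low", "medium", "high"}
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "reasoning_effort": effort,  # assumed field name
    }

payload = build_request("openai/gpt-oss-120b", "Explain MXFP4 briefly.", "high")
```

Sending this dict as the JSON body of a POST to your provider's `/v1/chat/completions` route is then the same as for any other chat model.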
Hugging Face (Twitter)
RT @OpenAI: Both gpt-oss models are free to download on Hugging Face, with native MXFP4 quantization built in for efficient deployment.
Full list of day-one support is available on our blog.
https://huggingface.co/collections/openai/gpt-oss-68911959590a1634ba11c7a4
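As a rough mental model of what MXFP4 does (this is a toy illustration, not OpenAI's actual kernels): weights are grouped into small blocks, each block shares a single power-of-two scale, and each weight is stored as a 4-bit E2M1 float whose representable magnitudes are 0, 0.5, 1, 1.5, 2, 3, 4, and 6.

```python
import math

# Toy MXFP4-style block quantization: one power-of-two (E8M0-style) scale
# per block, each value snapped to the nearest FP4 (E2M1) number.
FP4_VALUES = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def quantize_block(block):
    """Return (scale, codes); the dequantized value is code * scale."""
    amax = max(abs(x) for x in block)
    if amax == 0.0:
        return 1.0, [0.0] * len(block)
    # Smallest power-of-two scale such that amax/scale fits in FP4's max (6).
    scale = 2.0 ** math.ceil(math.log2(amax / 6.0))
    codes = []
    for x in block:
        mag = min(FP4_VALUES, key=lambda v: abs(v - abs(x) / scale))
        codes.append(math.copysign(mag, x))
    return scale, codes

def dequantize_block(scale, codes):
    return [c * scale for c in codes]
```

Because only the shared scale and 4-bit codes are stored, each block of 32 weights costs roughly 32 x 4 bits plus one scale byte, which is what makes the 120B model deployable on a single 80 GB GPU.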
Hugging Face (Twitter)
RT @romainhuet: Partnering with @huggingface has been incredible.
We also worked with @ollama, @lmstudio, @vllm_project, so you can run models locally with your favorite tool on day one, and with many cloud and hardware partners so you can deploy it efficiently anywhere.
openai.com/open-models/
Hugging Face (Twitter)
RT @ClementDelangue: And just like that, @OpenAI gpt-oss is now the number one trending model on @huggingface, out of almost 2M open models 🚀
People sometimes forget that they've already transformed the field: GPT-2, released back in 2019 is HF's most downloaded text-generation model ever, and Whisper has consistently ranked in the top 5 audio models.
Now that they are doubling down on openness, they may completely transform the AI ecosystem, again. Exciting times ahead!
Hugging Face (Twitter)
RT @OpenAIDevs: Student credits for gpt-oss
With @huggingface, we’re offering 500 students $50 in inference credits to explore gpt-oss.
We hope these open models can help unlock new opportunities in class projects, research, fine-tuning, and more: tally.so/r/mKKdXX
Hugging Face (Twitter)
RT @nation_grok: NEWS: Live in Cline via @huggingface: gpt-oss-120b & gpt-oss-20b
> gpt-oss-120b is built for production-grade applications with high reasoning capabilities
> gpt-oss-20b is for lower latency needs and specialized local use cases
Also accessible via the Cline & OpenRouter providers
Hugging Face (Twitter)
RT @vectro: Quantized gpt-oss-20b on @huggingface
Made to run locally on your computer
Hugging Face (Twitter)
RT @vanstriendaniel: You can now generate synthetic data using @OpenAI's GPT OSS models on @huggingface Jobs!
One command, no setup:
hf jobs uv run --flavor l4x4 [script-url] \
--input-dataset your/dataset \
--output-dataset your/output
Works on L4 GPUs ⚡
https://huggingface.co/datasets/uv-scripts/openai-oss
Hugging Face (Twitter)
RT @ArtificialAnlys: Link to question dataset on @huggingface:
https://huggingface.co/datasets/ArtificialAnalysis/AA-LCR
Hugging Face (Twitter)
RT @HaihaoShen: 🫡Probably the first INT4 GPT-OSS model; it should be friendly to most existing hardware. The 120B INT4 model is coming next.
#intel #huggingface #openai #gpt #autoround
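The "rtn" in the model name refers to round-to-nearest, the baseline INT4 scheme that AutoRound improves on by learning rounding offsets. A minimal sketch of plain symmetric RTN quantization (illustrative only, not Intel's implementation):

```python
# Toy symmetric round-to-nearest (RTN) INT4 quantization: one scale per
# group of weights, each weight rounded to a signed 4-bit integer.
# Note: Python's round() uses banker's rounding for .5 ties.

def rtn_int4(weights):
    """Quantize a list of floats to signed INT4 codes in [-8, 7]."""
    amax = max(abs(w) for w in weights)
    scale = amax / 7.0 if amax else 1.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return scale, q

def dequant(scale, q):
    return [scale * v for v in q]
```

RTN needs no calibration data, which is why it runs on "most existing hardware" with no fuss; AutoRound trades that simplicity for better accuracy at the same bit width.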
Intel/gpt-oss-20b-int4-rtn-AutoRound · Hugging Face
Hugging Face (Twitter)
RT @MaziyarPanahi: 🚀 In under 10 hours, both gpt-oss models from @OpenAI are now trending at #1 and #2 on @huggingface!
Openness wins again.
@sama, can you feel the love tonight? 🎶
Hugging Face (Twitter)
RT @wjb_mattingly: Woot! First finetune of Dots.OCR on @huggingface! Haven't done this since Qwen 2 VL. I'll be sharing the finetuning script tomorrow.
Hugging Face (Twitter)
RT @cline: Live in Cline via @huggingface: gpt-oss-120b & gpt-oss-20b
> gpt-oss-120b is built for production-grade applications with high reasoning capabilities
> gpt-oss-20b is for lower latency needs and specialized local use cases
Also accessible via the Cline & OpenRouter providers
Hugging Face (Twitter)
RT @reach_vb: Want to run the latest @OpenAI gpt-oss models with Continuous Batching, Tensor Parallelism, Flash Attention 3 and more?
Check out our detailed inference and fine-tuning recipes 🤗
Hugging Face (Twitter)
RT @ClementDelangue: Lots of conflicting takes about gpt-oss (yay open-source in the spotlight)!
We’re powering the official @openai demo gpt-oss.com with HF inference providers thanks to @FireworksAI_HQ, @CerebrasSystems, @GroqInc and @togethercompute so we have a front-row seat of what’s happening.
Something to remember: inference for new frontier open models isn’t easy, especially with a new format like harmony and the volume of interest that gpt-oss is getting out of the gate.
Early spikes can temporarily affect quality, accuracy, and overall "vibes," particularly just 24 hours post-release when providers are racing against the clock with barely any sleep!
Some advice to avoid forming the wrong opinions:
- If you care about getting vibes as fast as possible, use a hosted setup that gives you a diversity of providers like HF inference providers (https://huggingface.co/docs/inference-providers/guides/gpt-oss) and follow official standard...
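The "new format" mentioned above is harmony, the chat template gpt-oss is trained on. As a rough illustration of its shape (the special-token names below follow the openai-harmony documentation as I understand it; treat them as illustrative and use the official `openai-harmony` library for real prompts):

```python
# Simplified sketch of harmony-style message rendering. Token names are
# ASSUMED from the openai-harmony docs; this is not the official renderer.

def render(role: str, content: str, channel: str = "") -> str:
    """Render one message; assistant messages carry a channel (e.g. 'final')."""
    header = f"{role}<|channel|>{channel}" if channel else role
    return f"<|start|>{header}<|message|>{content}<|end|>"

prompt = render("user", "What is 2+2?") + render("assistant", "4", channel="final")
```

The channel field is what separates the raw chain-of-thought (an analysis-style channel) from the user-facing answer, which is why providers that mis-handle it can produce the off "vibes" described above.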