Hugging Face (Twitter)
RT @crystalsssup: Thanks! It's a gift we've been preparing for the community for over half a year. We'll keep working hard — more to come! 🙌 https://twitter.com/huggingface/status/1944155602583691492#m
Hugging Face (Twitter)
RT @xeophon_: SmolLM3 is a great model, might replace Qwen3 4B for me
congrats @eliebakouch :)
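For anyone tempted to swap it in, a minimal quick-start sketch with transformers; the repo id HuggingFaceTB/SmolLM3-3B is assumed from the release announcement, so double-check the model card:

# Minimal sketch: run SmolLM3 with transformers (repo id is an assumption).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM3-3B"  # assumed; check the SmolLM3 model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize what a mixture-of-experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))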
Hugging Face (Twitter)
RT @NielsRogge: This is all the code you need to get started with @Kimi_Moonshot Kimi K2 btw
Powered by @huggingface Inference Providers and @novita_labs
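The code screenshot from that tweet isn't reproduced here, but a minimal sketch of the same idea with huggingface_hub's InferenceClient, assuming the moonshotai/Kimi-K2-Instruct checkpoint served via the Novita provider:

# Minimal sketch: chat with Kimi K2 through Hugging Face Inference Providers.
# Needs huggingface_hub >= 0.28 and an HF token; model/provider names follow the tweet.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(provider="novita", api_key=os.environ["HF_TOKEN"])
completion = client.chat.completions.create(
    model="moonshotai/Kimi-K2-Instruct",
    messages=[{"role": "user", "content": "Explain MoE routing in two sentences."}],
    max_tokens=256,
)
print(completion.choices[0].message.content)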
Hugging Face (Twitter)
RT @caleb_joye: My new best friend 🤗 @huggingface Inference Provider endpoints are AMAZING! If you're looking to satisfy thirsty generative AI customers or develop content, this is the best deal I've seen! 🎉
https://huggingface.co/docs/inference-providers/index
Hugging Face (Twitter)
RT @IlirAliu_: Star Wars showed them to you in ‘77.
You grew up watching robots on screen.
But what if you could actually buy one?
For your lab. For your classroom.
For your... kid.
A French startup made it real, and affordable:🧵
Hugging Face (Twitter)
RT @Teknium1: We have not had any innovation on training dataset inspection/viewers since @lilac_ai disbanded into Mosaic's acquisition, so I'm very happy @huggingface is taking up the mantle to get us back there in modern times https://twitter.com/calebfahlgren/status/1943708053699748077#m
Hugging Face (Twitter)
RT @vanstriendaniel: Google Drive is great for many things — sharing research datasets isn’t one of them.
If your dataset isn’t on the @huggingface Hub yet, LLMs can now help. Inspired by @jeremyphoward’s llms.txt, we’ve made a guide to help LLMs convert your data to Hub format.
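As a rough illustration of the end state the guide aims for (this is not the guide itself), pushing a local file to the Hub with the datasets library can be as short as this; the file name and repo id are placeholders:

# Minimal sketch: turn a local CSV into a Hub dataset (run huggingface-cli login first).
# "data.csv" and "your-username/my-research-dataset" are placeholders.
from datasets import load_dataset

ds = load_dataset("csv", data_files="data.csv")
ds.push_to_hub("your-username/my-research-dataset")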
Hugging Face (Twitter)
RT @reach_vb: LOVE IT! You can run Kimi K2 (1T-parameter MoE) on a single M4 Max with 128GB unified memory (w/ offloading) or a single M3 Ultra (512GB) 🔥
The model was released less than 72 hours ago - love how fast the community optimises open weights - kudos to @UnslothAI 🤗
https://huggingface.co/unsloth/Kimi-K2-Instruct-GGUF
Hugging Face (Twitter)
RT @mervenoyann: past week had huuuge releases, here's our picks 🔥
> moonshot released Kimi K2, sota LLM with 1T total 32B active parameters 🤯
> @huggingface released SmolLM3-3B, best LM for its size, offers thinking mode 💭 as well as the dataset, smoltalk2
> Alibaba released WebSailor-3B, agentic LLM for complex browsing
> Google DeepMind released medical vision LMs MedGemma & MedSigLIP with an agentic doctor-patient app
> fal released a LoRA to improve details on face images
find link on the next one for more releases 🙏🏻
Hugging Face (Twitter)
RT @ArtificialAnlys: Kimi k2 compared to other models:
https://artificialanalysis.ai/models/kimi-k2
Kimi k2 provider benchmarks:
https://artificialanalysis.ai/models/kimi-k2/providers
Link to weights on @huggingface:
https://huggingface.co/moonshotai/Kimi-K2-Instruct
https://artificialanalysis.ai/models/kimi-k2
Hugging Face (Twitter)
RT @UnslothAI: You can now run Kimi K2 locally with our Dynamic 1.8-bit GGUFs!
We shrank the full 1.1TB model to just 245GB (-80% size reduction).
The 2-bit XL GGUF performs exceptionally well on coding & passes all our code tests
Guide: https://docs.unsloth.ai/basics/kimi-k2
GGUFs: https://huggingface.co/unsloth/Kimi-K2-Instruct-GGUF
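A sketch of grabbing just one quant from that repo with huggingface_hub before pointing llama.cpp (or another GGUF runner) at it; the *UD-Q2_K_XL* pattern is a guess at Unsloth's naming for the 2-bit XL files, so confirm it against the repo and the guide above:

# Minimal sketch: download a single Kimi K2 quant from the Unsloth GGUF repo.
# The filename pattern below is an assumption; the files are still ~245GB total.
from huggingface_hub import snapshot_download

path = snapshot_download(
    repo_id="unsloth/Kimi-K2-Instruct-GGUF",
    allow_patterns=["*UD-Q2_K_XL*"],
    local_dir="kimi-k2-gguf",
)
print("Downloaded to:", path)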
Hugging Face (Twitter)
RT @LechMazur: Open-weight model Kimi K2 by Alibaba-backed startup Moonshot is the new Short-Story Creative Writing champion 🏆! With a score of 8.56, it overtakes former champion o3-pro (8.44).
Additionally, Baidu Ernie 4.5 300B A47B was added, scoring 8.00.
Hugging Face (Twitter)
RT @NERDDISCO: when i opened my @huggingface account in 2023, i dreamed of giving something back to this amazing ai community...
today’s the day!
just published my first community post: LeRobot.js
https://huggingface.co/blog/NERDDISCO/lerobotjs https://twitter.com/NERDDISCO/status/1941284773617598688#m
Hugging Face (Twitter)
RT @adcock_brett: Hugging Face opened pre-orders for Reachy Mini, an expressive, open-source desktop robot
Starting at $299, the robot is designed for human-robot interaction, creative coding, and AI experimentation
And it's fully programmable in Python
Hugging Face (Twitter)
RT @LynaZhang: 🚀Our rStar-Coder dataset is now released!
A verified dataset of 418K competition-level code problems, each with test cases of varying difficulty. On LiveCodeBench, it boosts Qwen2.5-14B from 23.3% → 62.5%, beating o3-mini (low) by +3.1%.
Try it here:
microsoft/rStar-Coder · Datasets at Hugging Face (huggingface.co/datasets/microsoft/rStar-Coder)
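A sketch of peeking at the dataset with the datasets library; the split name and lack of a config are assumptions, so check the dataset card for the actual subsets:

# Minimal sketch: stream a few rStar-Coder examples without a full download.
# Config/split names are assumptions; the repo may expose several subsets.
from datasets import load_dataset

ds = load_dataset("microsoft/rStar-Coder", split="train", streaming=True)
for example in ds.take(3):
    print(example)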
Hugging Face (Twitter)
RT @EnricoShippole: We open-sourced 99% of US caselaw on @huggingface. Both AI and legal tech companies are selling this data for a high premium. You can simply just build a wrapper around it and freely compete with them now. That is why we love open-source.
https://twitter.com/intellectronica/status/1944792410124648532#m
common-pile/caselaw_access_project · Datasets at Hugging Face (huggingface.co/datasets/common-pile/caselaw_access_project)
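Given the size (millions of cases), streaming is the practical way to poke at it; a minimal sketch, assuming the default config loads without a subset name:

# Minimal sketch: stream the open caselaw dataset instead of downloading it all.
# Assumes the default config; check the dataset card for fields and subsets.
from datasets import load_dataset

cases = load_dataset("common-pile/caselaw_access_project", split="train", streaming=True)
for case in cases.take(1):
    print(list(case.keys()))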
Hugging Face (Twitter)
RT @Xianbao_QIAN: Model hub -> Application (space) hub -> Papers hub
What's the next big hub on HF?
> Kernel hub <
Arthur Zucker (@art_zucker)
We already have a solution for kernel install issues; in transformers you can hotswap with this: https://huggingface.co/kernels-community/flash-attn3. It's a single install, very light (~200MB?) because it matches only your hardware, and... will support Metal…
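For context, the kernels library's entry point for pulling a prebuilt, hardware-matched kernel from the Hub is get_kernel; a minimal sketch, with the caveat that the attributes exposed by the flash-attn3 repo are not spelled out here:

# Minimal sketch: fetch a compiled kernel from the Hub (pip install kernels).
# Inspect the returned module for its actual entry points; names vary per repo.
from kernels import get_kernel

flash_attn3 = get_kernel("kernels-community/flash-attn3")
print([name for name in dir(flash_attn3) if not name.startswith("_")])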
Hugging Face (Twitter)
RT @ZeffMax: I spoke to Hugging Face cofounder @Thom_Wolf about the Reachy Mini, and the company's bet on cute, desktop robotic devices to bring open source AI models into people's homes.
TBH, I think the Reachy Mini is one of a few AI devices people I know are really excited about.
Hugging Face (Twitter)
RT @rohanpaul_ai: Did you know 99% of US caselaw is available open-sourced on @huggingface? 😯
This dataset contains 6.7 million cases from the Caselaw Access Project and Court Listener.
The Caselaw Access Project consists of nearly 40 million pages of U.S. federal and state court decisions and judges’ opinions from the last 365 years.
In addition, Court Listener adds over 900 thousand cases scraped from 479 courts.
The Caselaw Access Project and Court Listener source legal data from a wide variety of resources such as the Harvard Law Library, the Law Library of Congress, and the Supreme Court Database.
From these sources, this dataset only included documents that were in the public domain.
OCR errors were corrected after digitization, and additional post-processing was done to fix formatting and parsing. https://twitter.com/EnricoShippole/status/1945129974375039226#m