Hugging Face
Hugging Face (Twitter)

RT @skalskip92: supervision-0.26.0 is out

we finally released support for ViTPose and ViTPose++ pose estimation models from @huggingface transformers

link: https://github.com/roboflow/supervision
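(Not from the post: a minimal sketch of what top-down pose estimation with ViTPose via transformers looks like. The checkpoint name, the example box values, and the supervision connector name are assumptions for illustration.)

```python
# Sketch: ViTPose inference with transformers, to be visualized with supervision.
# Assumes the "usyd-community/vitpose-base-simple" checkpoint and that person boxes
# (COCO x, y, w, h format) come from whatever detector you already run.
import torch
from PIL import Image
from transformers import AutoImageProcessor, VitPoseForPoseEstimation

processor = AutoImageProcessor.from_pretrained("usyd-community/vitpose-base-simple")
model = VitPoseForPoseEstimation.from_pretrained("usyd-community/vitpose-base-simple")

image = Image.open("people.jpg")
person_boxes = [[110.0, 45.0, 180.0, 400.0]]  # one [x, y, w, h] box per person (example values)

inputs = processor(image, boxes=[person_boxes], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One dict per detected person, with "keypoints" and "scores" tensors.
pose_results = processor.post_process_pose_estimation(outputs, boxes=[person_boxes])[0]
print(pose_results[0]["keypoints"])

# supervision 0.26.0 adds a connector for these results (exact name assumed here):
# key_points = sv.KeyPoints.from_transformers(pose_results)
```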
Hugging Face (Twitter)

RT @Xianbao_QIAN: Livestream of building your own HopeJR.

111 people online. Almost 4000 likes after 5 hours of live streaming. Amazing!

Thanks @bilibili_en for the support.

Link below:
Hugging Face (Twitter)

RT @reach_vb: Let's GOOO! @NVIDIAAIDev just dropped Canary Qwen 2.5 - SoTA on the Open ASR Leaderboard, CC-BY licensed 🔥

> Works in both ASR and LLM mode (i.e. ask the model to summarise or answer questions about the audio)
> Achieves the lowest WER at 5.62
> RTFx of 418 for a 2.5B model is impressive
> Commercially permissive license

Can even run on a free Colab! Kudos Nvidia team - looking forward to multilingual versions soon! 🤗
Hugging Face (Twitter)

RT @MaziyarPanahi: 🚀 Big news in healthcare AI! I'm thrilled to announce the launch of OpenMed on @huggingface, releasing 380+ state-of-the-art medical NER models for free under Apache 2.0.

And this is just the beginning! 🧵
Hugging Face (Twitter)

RT @togethercompute: Most AI benchmarks test the past.

But real intelligence is about predicting the future.

Introducing FutureBench, a new benchmark for evaluating agents on real forecasting tasks that we developed with @huggingface

πŸ” Reasoning > memorization
πŸ“Š Real-world events
🧠 Dynamic, verifiable outcomes

Read more (link below)
Hugging Face (Twitter)

RT @DataScienceHarp: Had an awesome time visiting the @huggingface office in Paris! Thank you @mervenoyann for the invite. Good to finally meet you and the legend @reach_vb in person. Looking forward to the next time. Cheers!
Hugging Face (Twitter)

RT @pollenrobotics: 🖐️ 4 fingers, 8 degrees of freedom
🔩 Dual hobby servos per finger
🦴 Rigid "bones" with a soft TPU shell
🖨️ Fully 3D printable
⚖️ Weighs 400g and costs under €200

This is the "Amazing Hand". Check it out 👇

Try, tweak & share: https://huggingface.co/blog/pollen-robotics/amazing-hand
Hugging Face (Twitter)

RT @ErikKaum: We just released native support for @sgl_project and @vllm_project in Inference Endpoints 🔥

Inference Endpoints is becoming the central place where you deploy high-performance inference engines.

And it provides the managed infra for it, so you can focus on your users.
Hugging Face (Twitter)

RT @pydantic: Pydantic AI now supports @huggingface as a provider!
You can use it to run open source models like DeepSeek R1 on scalable serverless infrastructure. They have a free tier allowance so you can test it out.

Thanks to the Hugging Face team (@hanouticelina) for this great contribution.
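(Not from the post: a minimal sketch of what the integration looks like. The "huggingface:" model-string prefix follows Pydantic AI's usual provider naming convention and the model ID is chosen for illustration; both are assumptions.)

```python
# Sketch: using Hugging Face Inference Providers as a Pydantic AI backend.
# Assumes an HF_TOKEN in the environment and the "huggingface:" model-string prefix.
from pydantic_ai import Agent

agent = Agent(
    "huggingface:deepseek-ai/DeepSeek-R1",  # model ID assumed for illustration
    system_prompt="Answer concisely.",
)

result = agent.run_sync("What does WER measure in speech recognition?")
print(result.output)  # ".output" on recent releases; older versions expose ".data"
```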
Hugging Face (Twitter)

RT @ClementDelangue: It's so beautiful to see the @Kimi_Moonshot team participating in every single community discussion and pull request on @huggingface (the little blue bubbles on the right).

In my opinion, every serious AI organization should dedicate meaningful time and resources to this, because that's how you build an engaged AI builder community!
Hugging Face (Twitter)

RT @reach_vb: You asked, we delivered! Hugging Face Inference Providers is now fully OpenAI client compatible! 🔥

Simply append the provider name to the model ID.

The OpenAI client is arguably the most used client when it comes to LLMs, so getting this right is a big milestone for the team! 🤗
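(Not from the post: a minimal sketch of the pattern described above. The router base URL and the model ID are assumptions for illustration; the token comes from your environment.)

```python
# Sketch: calling Hugging Face Inference Providers through the standard OpenAI client.
# Assumes the router endpoint below and an HF_TOKEN environment variable.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://router.huggingface.co/v1",  # HF Inference Providers router (assumed URL)
    api_key=os.environ["HF_TOKEN"],
)

completion = client.chat.completions.create(
    # A plain model ID lets the router pick a provider; append ":<provider>" to pin one,
    # e.g. "moonshotai/Kimi-K2-Instruct:groq".
    model="moonshotai/Kimi-K2-Instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(completion.choices[0].message.content)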
Hugging Face (Twitter)

RT @calebfahlgren: The @huggingface Inference Providers is getting even easier to use! Now with a unified OpenAI client route.

Just use the model id and it works. You can also set your preferred provider with `:groq` for example.

Here's how easy it is to use @GroqInc and Kimi K2
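(The demo clip isn't embedded here. As an alternative sketch of the same Groq + Kimi K2 call, `huggingface_hub`'s own client takes the provider as an argument instead of a `:groq` suffix; the exact model repo ID is an assumption.)

```python
# Sketch: Kimi K2 served via Groq using huggingface_hub's InferenceClient.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(provider="groq", token=os.environ["HF_TOKEN"])

completion = client.chat_completion(
    model="moonshotai/Kimi-K2-Instruct",  # repo ID assumed for illustration
    messages=[{"role": "user", "content": "Summarise what an inference provider is."}],
    max_tokens=128,
)
print(completion.choices[0].message.content)
```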
Hugging Face (Twitter)

RT @cline: 🤗🤗🤗
🤗❤️🤗 @huggingface & Cline = your LLM playground
🤗🤗🤗

You can access Kimi K2 & 6,140 (!) other open source models in Cline.
Hugging Face (Twitter)

RT @marimo_io: Announcing molab: a cloud-hosted marimo notebook workspace with link-based sharing.

Experiment on AI, ML and data using the world's best Python (and SQL!) notebook.

Launching with examples from @huggingface, @weights_biases, and using @PyTorch

https://marimo.io/blog/announcing-molab
Hugging Face (Twitter)

RT @cline: Here's how you can use the @huggingface provider in Cline 🤗

(thread)