🤖 Welcome to the ChatGPT telegram channel! Here, we post the latest news, updates, and examples of using the ChatGPT large language model for generating human-like text in conversations. Subscribe to stay up-to-date and learn more about its capabilities.
Hi
Thanks ChatGPT

It’s a snek
“ChatGPT Addiction”
ChatGPT corrects itself in the middle of generating a message.

Write me an integral that solves to 69
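There are infinitely many valid answers; one example of my own (not necessarily what ChatGPT produced):

```latex
\int_0^{\sqrt{138}} x \, dx
  = \left[ \frac{x^2}{2} \right]_0^{\sqrt{138}}
  = \frac{138}{2}
  = 69
```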
Didn't know it would have so many questions for me 😳
Trying ERNIE, China's ChatGPT, created to push Chinese values

* Seems to copy-paste hard-coded official sources whenever certain topics are mentioned, even if those don’t really answer the question

* Mediocre language abilities, even in Chinese, for now

* Mediocre reasoning abilities, for now

* Hilarious image drawing results

Article
Stanford DSPy: The framework for programming with foundation models

“DSPy introduces an automatic compiler that teaches LMs how to conduct the declarative steps in your program. Specifically, the DSPy compiler will internally trace your program and then craft high-quality prompts for large LMs (or train automatic finetunes for small LMs) to teach them the steps of your task.”

Github
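A toy sketch of the compile idea, assuming nothing about DSPy's actual API (all names below are invented): declarative step names plus worked demonstrations are turned into a single prompt template.

```python
# Hypothetical illustration only, NOT DSPy's real API: "compile" a list
# of declarative steps and a few demos into one prompt string.
def compile_prompt(steps, demos):
    """Turn declarative step names plus worked demos into one prompt."""
    lines = ["Follow these steps:"]
    lines += [f"{i}. {s}" for i, s in enumerate(steps, 1)]
    lines.append("Examples:")
    lines += [f"Q: {q}\nA: {a}" for q, a in demos]
    lines.append("Q: {question}\nA:")  # slot for the real question
    return "\n".join(lines)

prompt = compile_prompt(
    steps=["retrieve context", "reason over it", "answer concisely"],
    demos=[("2+2?", "4")],
)
```

The real compiler does much more (tracing, metric-driven demo selection, finetuning), but the input/output shape is the same: a declarative program in, a concrete prompt or finetune out.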
Coming soon to LLMs

And everything else.
Russia Enters the AI LLM Foundation Model Race

“Putin has ordered the government to “implement measures” to support AI research, including by “providing for an annual allocation from the federal budget”.”

“This research would include “optimising machine learning algorithms” as well as developing “large language models” – such as the one developed by OpenAI.”

“Putin has repeatedly called for Russia to achieve what he calls “technological sovereignty”, as Western sanctions over the conflict in Ukraine block Moscow from getting computer parts such as semiconductors.”

Article
Who will win: Smartness of humans vs smartness of machines
Running a 180 billion parameter LLM on a single Apple M2 Ultra

GPT-4 is rumored to be ~1.5 trillion parameters.
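A back-of-the-envelope check on why this fits (my numbers, not from the post): the top M2 Ultra configuration has 192 GB of unified memory, and the weight footprint depends on quantization.

```python
# Rough weight-memory arithmetic for a 180B-parameter model on an
# M2 Ultra with 192 GB unified memory. Activations, KV cache, and OS
# overhead are ignored, so "fits" here is optimistic.
params = 180e9
m2_ultra_mem_gb = 192  # top M2 Ultra configuration

for bits in (16, 8, 4):
    weight_gb = params * bits / 8 / 1e9
    print(f"{bits}-bit weights: {weight_gb:.0f} GB "
          f"({'fits' if weight_gb < m2_ultra_mem_gb else 'does not fit'})")
```

So full 16-bit weights (360 GB) are out of the question, but 4-bit quantized weights (90 GB) fit comfortably, which is how demos like this are run.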
No one suspects that model sizes could keep steadily increasing, at a rate far faster than Moore’s Law can keep up with.

Which would mean that the dream of running your own modern LLMs at home would be put further and further out of reach, with each passing year, forever.

No one is bothering to look at the numbers.
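Looking at the numbers, the gap between two exponentials is itself exponential. A quick illustration, with doubling times that are my assumptions, not measured figures:

```python
# Illustrative only: compare two exponential growth curves. The
# doubling times below are assumed for the sake of the example.
model_doubling_months = 12   # assumption: model sizes double yearly
moores_doubling_months = 24  # classic Moore's Law doubling period

def growth(years, doubling_months):
    """Growth factor after `years` at the given doubling period."""
    return 2 ** (years * 12 / doubling_months)

for years in (2, 5, 10):
    gap = growth(years, model_doubling_months) / growth(years, moores_doubling_months)
    print(f"after {years} years, models outgrow hardware by {gap:.0f}x")
```

Under these assumptions the hardware deficit compounds every year, which is the "forever out of reach" scenario above.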