🤖 Welcome to the ChatGPT telegram channel! Here, we post the latest news, updates, and examples of using the ChatGPT large language model for generating human-like text in conversations. Subscribe to stay up-to-date and learn more about its capabilities.
Chat GPT
🛕Meme Pyramid🛕

Anyone interested in inviting the best meme creators to the group, if you could later get a cut of any future cash prizes they win? 50% for 1st degree, 25% for 2nd degree, etc. If yes, reply in the comments and we'll build it.
🛕MEME PYRAMID NOW LIVE🛕

Invite people to @chadgptcoin to get a cut of all their future rewards, plus a cut of the future rewards of anyone they invite, and of anyone those people invite, and so on.

Example:
▶ You invite Bob to @chadgptcoin,
▶ If Bob later wins a daily meme contest, you get 50% on top of what Bob gets,
▶ If Bob invites Frank, and later Frank wins, you get 25% on top of what Frank wins,
▶ etc.
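
In code, the halving-by-degree pattern in the example might look something like this (a hypothetical sketch only; the 50%/25% schedule comes from the example above, and the function name and prize amount are made up):

```python
def referral_cut(prize: float, degree: int) -> float:
    """Hypothetical cut for an inviter `degree` steps above the winner:
    50% of the prize at 1st degree, 25% at 2nd degree, halving each step,
    paid on top of what the winner receives."""
    return prize * 0.5 ** degree

# You invited Bob (degree 1), Bob invited Frank (degree 2).
# If Frank wins a prize of 100, Bob gets 50 on top and you get 25 on top.
print(referral_cut(100, 1))  # 50.0
print(referral_cut(100, 2))  # 25.0
```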

Future rewards sources include:
▶ Winning the daily meme contest. See /memes in @chadgptcoin.
▶ Commissions earned from use of the upcoming Chad Wallet.
▶ Whatever else we come up with!

How to use?
▶ Just invite people to @chadgptcoin using the "+Add" button on the group details page. That's it.

🛕MEME PYRAMID NOW LIVE🛕
❀8πŸ‘5πŸ”₯4😁4πŸ₯°3πŸ‘3πŸŽ‰3🍾2😐1
Make a joke about ____________

OpenAI Convo
🀬16🀣10❀1✍1😐1πŸ€“1
Got early access to GPT-5
🀣35😁11❀2πŸ‘2
make a joke about __________
🀬17🌚4❀2🀯2πŸ€“1
Well…
😐13😁2🌚2❀1πŸ‘1
Then they came for the song lyrics
🀬15❀1😒1
Nvidia Makes Nearly 1,000% Profit on H100 GPUs(!!!)

"Nvidia is raking in nearly 1,000% (about 823%) in profit percentage for each H100 GPU accelerator it sells, according to estimates made in a recent social media post from Barron's senior writer Tae Kim. In dollar terms, that means that Nvidia's street price of around $25,000 to $30,000 for each of these High Performance Computing (HPC) accelerators (for the least-expensive PCIe version) more than covers the estimated $3,320 cost per chip and peripheral (in-board) components. As surfers will tell you, there's nothing quite like riding a wave with zero other boards in sight."
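
As a rough sanity check, profit percentage here is just (price - cost) / cost, using only the figures in the excerpt; the exact ~823% presumably depends on which price and cost the original estimate assumed:

```python
# Rough markup check using only the figures quoted above.
cost = 3_320                    # estimated per-unit cost (chip + in-board components)
for price in (25_000, 30_000):  # quoted street-price range for the PCIe H100
    profit_pct = (price - cost) / cost * 100
    print(f"${price:,}: ~{profit_pct:.0f}% profit")
# $25,000: ~653% profit
# $30,000: ~804% profit  (in the ballpark of the quoted ~823%)
```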

Article
🀯11❀1πŸ‘1
Bing gets offended

who is better you or chat gpt?
😁18❀2πŸ‘2
Made a nihilistic GPT-powered chatbot that accused me of enjoying making him suffer and asked me to delete him.
😱15πŸ‘Œ3❀‍πŸ”₯1❀1πŸ‘1
I just solved the GPU shortage crisis
🀣37❀3
How would a Japanese person pronounce lululemon?
🀣15❀1πŸ’―1
Good news: OpenAI now allows fine-tuning on another model, other than just the ancient GPT-3 base models (ada, babbage, curie, davinci)

Bad news: It's just the intentionally-crippled-for-cheapness GPT-3.5-turbo.

Not the original impressive GPT-3.5-legacy.

Not the now greatly-preferred GPT-4.

Just the cheap, crippled joke of a "3.5-turbo" that almost never works for any real work or homework, and that they made just so they could cut costs on their free services.

OpenAI allowing fine-tuning on their current state-of-the-art model?

Not looking like that's ever going to happen.

Wonder exactly why?

Announcement

Fine-tuning guide
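
For reference, starting a gpt-3.5-turbo fine-tune looks roughly like this (a minimal sketch assuming the openai Python package with the 0.x-era API, an API key in OPENAI_API_KEY, and a prepared chat-format JSONL training file; the file name is a placeholder):

```python
# Minimal sketch: fine-tuning gpt-3.5-turbo with the openai Python package (0.x-era API).
import openai

# Upload training data; each JSONL line holds a {"messages": [...]} chat example.
training_file = openai.File.create(
    file=open("training_data.jsonl", "rb"),
    purpose="fine-tune",
)

# Start the fine-tuning job on gpt-3.5-turbo, the only newly supported model.
job = openai.FineTuningJob.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)  # poll or check the dashboard until the job completes
```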
🀬10❀2πŸ‘1
Widespread implicit belief: Only a matter of time before anyone will be able to create their own models competitive with OpenAI, due to advances in tech making it cheaper

"BTW, I don't see cost as an issue. Just as your smartphone chess program can beat the human world champion today, today's DeepMind AI algorithms will run on your granddaughter's smart rings."
❀8⚑1
Cost to Create a Competitive AI Model Over Time

We don’t have to worry, because soon anyone will be able to afford to create their own OpenAI-competitive model at home. This is because, IN THE LONG RUN, technology advances always bring down costs.
Anonymous Poll
43% TRUE: We DO NOT have to worry about the cost of competing with OpenAI, because soon it will be CHEAP
28% FALSE: We do have to worry about the cost of competing with OpenAI, as costs will VASTLY INCREASE
29%
Show results
πŸ‘3❀1πŸ”₯1πŸ‘1
So… anyone got a theory on the thousands of simple β€œhow to reply” messages?

Or am I just overthinking this?
😁9πŸ‘2🀯2❀1πŸ‘Ύ1
Is higher energy consumption key to higher quality of life?

See the quality-of-life indexes vs. energy use per country.

Sure looks like the more energy per person expended, the happier we are.

(Remember that a large fraction of per-person energy expenditure occurs in factories, e.g. to build your phone, and in data centers, e.g. to run your YouTube and ChatGPT.)

Is human appetite for energy consumption unbounded, limited only by what we can currently afford?
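
If you'd rather check the relationship than eyeball the chart, the comparison is just energy use per capita against a quality-of-life index by country (a sketch with pandas; the CSV file and column names are hypothetical placeholders for whatever dataset you actually use):

```python
# Sketch of the country-level comparison described above.
# "countries.csv" and its columns are hypothetical; substitute your own data.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("countries.csv")  # columns: country, energy_per_capita, qol_index
print(df["energy_per_capita"].corr(df["qol_index"]))  # crude strength-of-relationship check

df.plot.scatter(x="energy_per_capita", y="qol_index")
plt.show()
```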
❀5πŸ‘4