πŸ€– Welcome to the ChatGPT telegram channel! Here, we post the latest news, updates, and examples of using the ChatGPT large language model for generating human-like text in conversations. Subscribe to stay up-to-date and learn more about its capabilities.
Check out this 3-year-old tool, trained on GPT-2 outputs.

Work for you guys?

https://huggingface.co/openai-detector
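If the web demo is slow, you can try the same detector locally. A minimal sketch, assuming the demo is backed by the RoBERTa-based GPT-2 output detector model on the Hub ("roberta-base-openai-detector"); treat the model id as an assumption and swap in the correct one if it differs:

```python
# Hedged sketch: querying the GPT-2 output detector locally with transformers.
# Assumption: the demo at huggingface.co/openai-detector is backed by the
# "roberta-base-openai-detector" model on the Hub; swap in the right id if not.
from transformers import pipeline

detector = pipeline("text-classification", model="roberta-base-openai-detector")
print(detector("Paste a chunk of suspected GPT-generated text here."))
# -> [{'label': ..., 'score': ...}]; label names come from the model card
```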
parth007_96’s brilliant notes on reverse-engineering GitHub Copilot:

https://thakkarparth007.github.io/copilot-explorer/posts/copilot-internals
β€œbest prompts aren’t even plain text anymore, they’re increasingly code-centric themselves”
The Achilles' heel of GPT-3 and other LLMs is short context length: it caps how many in-context examples they can consume to learn a new task.

Enter "Structured Prompting": scale your examples from dozens => 1,000+

Here's how:

=> Get thousands of in-context samples

=> Split them into M groups, each small enough to fit in the regular context length

=> Encode each of the M groups with the LM

=> Combine the encoded groups and let the test input attend over all of them at once, with attention rescaled (minimal sketch after the links below)

Paper: https://arxiv.org/pdf/2212.06713.pdf

Code: https://github.com/microsoft/LMOps
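A minimal sketch of the "encode groups separately, attend jointly" idea, assuming a GPT-2-style Hugging Face causal LM; the paper's attention rescaling and right-aligned position embeddings are omitted, and the example texts are made up:

```python
# Sketch of Structured Prompting's core trick: encode each group of
# demonstrations independently, then let the test input attend over the
# concatenation of their key/value caches.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

def encode_group(text: str):
    """Run one group of demonstrations through the LM and keep its KV cache."""
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, use_cache=True)
    return out.past_key_values  # per-layer (key, value) tensors

# M groups of in-context examples, each short enough for the normal context window.
groups = [
    "Review: great film, loved every minute. Sentiment: positive\n",
    "Review: boring plot and wooden acting. Sentiment: negative\n",
]
caches = [encode_group(g) for g in groups]

# Concatenate the per-group caches along the sequence axis (batch, heads, seq, dim)
# so the query can attend over all encoded demonstrations at once.
merged = tuple(
    (
        torch.cat([cache[layer][0] for cache in caches], dim=2),
        torch.cat([cache[layer][1] for cache in caches], dim=2),
    )
    for layer in range(len(caches[0]))
)

query = tok("Review: I really enjoyed this one. Sentiment:", return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(query, past_key_values=merged).logits
print(tok.decode([logits[0, -1].argmax().item()]))
```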
πŸ‘1
πŸ‘2
AI has a lying problem
AI Alignment
❀7πŸ‘Œ2πŸ‘1
AI Alignment
❀6πŸ‘Œ2
What will finally enable this?
πŸ‘4🀣2πŸ€”1
The student
Honor Roll student of the future.
The moment when teachers figure out they can grade 300 essays in 15 minutes with ChatGPT.
AI has a wokeness problem,
and a lying problem.
πŸ‘5😐1