VoiceGPT is here!
2023 is going to be the year of AI, from Dall-E to ChatGPT🚀
VALL-E is an audio LLM with emergent in-context learning capabilities: it can synthesize high-quality personalized speech using only a 3-second enrolled recording of an unseen speaker as an acoustic prompt.
https://valle-demo.github.io/
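No official code is out yet, so here’s only a rough sketch of what the zero-shot workflow would look like — the `valle` package, `load_pretrained`, and `synthesize` names below are hypothetical placeholders, not a real API:
```python
# Hypothetical sketch only: VALL-E has no public release, so `valle`,
# `load_pretrained`, and `synthesize` are invented names illustrating the
# 3-second acoustic-prompt workflow described in the paper.
import soundfile as sf   # real package for reading/writing audio
import valle             # hypothetical package

model = valle.load_pretrained("valle-base")           # neural codec language model
prompt_audio, sr = sf.read("unseen_speaker_3s.wav")   # 3-second enrolled recording

wave = model.synthesize(
    text="Hello, this is a personalized voice demo.",
    acoustic_prompt=prompt_audio,   # conditions the model on the unseen speaker
    sample_rate=sr,
)
sf.write("cloned_voice.wav", wave, sr)
```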
Step 5.
Automate these steps.
Step 6.
Get in the habit of always first asking the AI what to do, even for things you usually wouldn’t even be able to solve yourself.
Step 7.
Life is better than ever, society booming.
Step 8.
Start to realize that something feels a little off, but the AI always reassures you that everything is just fine. No need to worry for now.
Step 9.
Oops.
Some guys seriously think ChatGPT can decode LZ4 compression 😂
But is it as crazy as it sounds? Mixed.
Gwern adds:
“A fun idea, but it seems like it ought to backfire: a Transformer model is feedforward, so it only has so many layers to 'think'; if it's spending those layers decoding compressed strings (which is amazing if it can), it doesn't have any 'time' to think about whatever abstract form it decodes the inputs to, and it definitely doesn't have time to combine a de facto huge context window to reason over it.
Now, a more interesting idea might be to see if you can train a Transformer on compressed strings to make it native, and maybe tradeoff some width for depth. There's some slight analogies in image generation to training on Fourier components (like JPEG) rather than pixels. But AFAIK no one has experimented with heavy duty compression inputs for regular natural language generation.”
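If you want to see what’s actually being asked of the model, here’s a quick back-of-the-envelope check (assumes the Python `lz4` package; raw LZ4 bytes would also need base64 or similar to survive as chat text):
```python
# Quick sanity check on the "paste compressed text into ChatGPT" idea.
# Assumes `pip install lz4`; base64 is needed so the bytes survive as plain text.
import base64
import lz4.frame

text = ("Large language models are trained on plain text, so it is unclear "
        "whether they can invert a byte-level compressor purely in-context. ") * 4

compressed = lz4.frame.compress(text.encode("utf-8"))
as_prompt = base64.b64encode(compressed).decode("ascii")

print(f"plain text:   {len(text)} chars")
print(f"lz4 + base64: {len(as_prompt)} chars")

# Round trip locally — this is the work the model would have to do in its layers:
assert lz4.frame.decompress(base64.b64decode(as_prompt)).decode("utf-8") == text
```
Between the base64 overhead and the way BPE tokenizers split random-looking strings into many tokens, the savings tend to shrink fast in practice.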
AI Humor Gap:
Do Androids Laugh at Electric Sheep?
Humor “Understanding” Benchmarks from The New Yorker Caption Contest
“We demonstrate that today’s vision and language models still cannot recognize caption relevance, evaluate, or explain The New Yorker Caption Contest as effectively as humans can”
https://arxiv.org/pdf/2209.06293.pdf
ChatGPT/GPT3 Siri Replacement
Who wants to talk to GPT3?
Here’s an Apple Shortcut template to speak your prompt and have Siri read back the completion.
You’ll need an iPhone, the Scriptable app, and your own OpenAI API key (add it to the Script block).
OpenAI API key:
https://beta.openai.com/account/api-keys
Siri Shortcut:
https://www.icloud.com/shortcuts/08b02bf88171454387dd4e44539d575b
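If you want to sanity-check your key before wiring it into the Shortcut, here’s a minimal Python equivalent of the request the Script block makes (assumes the `requests` package and the text-davinci-003 completions endpoint — adjust to whichever model the Shortcut’s script actually calls):
```python
# Minimal sketch of the API call, in Python so you can test your key on a laptop
# first. Assumes `requests` and the text-davinci-003 completions endpoint.
import os
import requests

api_key = os.environ["OPENAI_API_KEY"]  # from https://beta.openai.com/account/api-keys

resp = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "text-davinci-003",
        "prompt": "Tell me everything about large language models.",
        "max_tokens": 256,
        "temperature": 0.7,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"].strip())
```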
GPT-Siri
“tell me everything about large language models.”
Want to try ChatGPT now?
To use:
1. Join the group
2. Type "/ask ___” for ChatGPT, or
3. Type “/chad ___” for CHAD GPT
https://t.iss.one/LLMchat