Disillusionment, Disbelief
Gartner's Hype Cycle, with its promise that every hype wave must soon be followed by a trough of disillusionment, is almost always taken as true.
Often, it is true.
But where does the trough prediction turn out to be a lie?
On tech that we're all heavily using at this moment. General compute. Moore's law (and the exponential price-performance trend that preceded it). 120 years and counting. Perfect ongoing exponential increase. No trough.
Can you guess where else it won't turn out to be true, with a trough that never comes?
The Sparks of AGI have been Ignited
"In this paper, we report on our investigation of an early version of GPT-4, when it was still in active development by OpenAI. We contend that (this early version of) GPT-4 is part of a new cohort of LLMs (along with ChatGPT and Google's PaLM for example) that exhibit more general intelligence than previous AI models. We discuss the rising capabilities and implications of these models. We demonstrate that, beyond its mastery of language, GPT-4 can solve novel and difficult tasks that span mathematics, coding, vision, medicine, law, psychology and more, without needing any special prompting. Moreover, in all of these tasks, GPT-4's performance is strikingly close to human-level performance, and often vastly surpasses prior models such as ChatGPT. Given the breadth and depth of GPT-4's capabilities, we believe that it could reasonably be viewed as an early (yet still incomplete) version of an artificial general intelligence (AGI) system."
Paper: Sparks of Artificial General Intelligence: Early experiments with GPT-4
Nvidia: 'We Won't Sell to Companies That Use Generative AI To Do Harm'
Nvidia says it will stop selling GPUs to companies engaging in unethical AI projects.
"We only sell to customers that do good," Nvidia CEO Jensen Huang told journalists on Wednesday. "If we believe that a customer is using our products to do harm, we would surely cut that off."
Nvidia's GPUs have played a pivotal role in developing ChatGPT, which is taking the world by storm. The AI-powered chatbot from OpenAI was reportedly trained using the help of tens of thousands of Nvidia A100 chips, which can individually cost around $10,000.
Article
Produce TikZ code that draws a person composed from letters in the alphabet. The arms and torso can be the letter Y, the face can be the letter O (add some facial features) and the legs can be the legs of the letter H. Feel free to add other features.
The torso is a bit too long, the arms are too short and it looks like the right arm is carrying the face instead of the face being right above the torso. Could you correct this please?
Please add a shirt and pants.
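For reference, a minimal hand-written TikZ sketch along the lines of the first prompt. This is an illustrative reconstruction, not the model's actual output; the letter placement, coordinates, and facial features are all guesses:

```latex
\documentclass[tikz]{standalone}
\begin{document}
\begin{tikzpicture}
  % Face: the letter O, with dots for eyes and an arc for a mouth
  \node[font=\Huge] at (0,2.1) {O};
  \fill (-0.12,2.25) circle (0.035);
  \fill ( 0.12,2.25) circle (0.035);
  \draw (-0.13,2.0) to[bend right=40] (0.13,2.0);
  % Arms and torso: the letter Y, placed directly under the face
  \node[font=\Huge] at (0,1.25) {Y};
  % Legs: the letter H, whose two vertical strokes read as legs
  \node[font=\Huge] at (0,0.45) {H};
\end{tikzpicture}
\end{document}
```

The follow-up prompts in the paper then iterate on exactly this kind of output: nudging the vertical spacing so the face sits above the torso, lengthening the arms, and layering clothing shapes on top.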
Combining GPT-4 and stable diffusion
"Here, we explore the possibility of combining GPT-4 and existing image synthesis models by using the GPT-4 output as the sketch. As shown in Figure 2.8, this approach can produce images that have better quality and follow the instructions more closely than either model alone. We believe that this is a promising direction for leveraging the strengths of both GPT-4 and existing image synthesis models. It can also be viewed as a first example of giving GPT-4 access to tools,"
Apple Neural Engine (ANE) Transformers: Transformer architecture optimized for Apple Silicon
A PyTorch implementation for deploying your Transformer models on Apple devices with an A14 or newer, or an M1 or newer, chip, achieving up to 10 times faster inference and up to 14 times lower peak memory consumption compared to baseline implementations.
Research Article
Github
Yeah, so crazy man, that OpenAI, who BANNED EVERYONE EXCEPT THEMSELVES from fine-tuning on their latest models, was the first to release a product that required fine-tuning on their latest models
Real mystery for the ages bro.
We'd better ask ChatGPT for help with this incomprehensible logic puzzle.
In the future you won't even have to press the buttons