Headings and Subheadings - ChatGPT Cheat Sheet
convert this text into headings and subheadings: Babe Ruth joined the New York Yankees in 1920. The Boston Red Sox sold his contract to the Yankees. He played for the Yankees from 1920 to 1934, and during that time he established himself as one of the greatest players in baseball history.
Correct ChatGPT on Its Knowledge (just for remainder of the current session) - ChatGPT Cheat Sheet
what is deepsparse
this is deepsparse: DeepSparse
An inference runtime offering GPU-class performance on CPUs and APIs to integrate ML into your application
A CPU runtime that takes advantage of sparsity within neural networks to reduce compute.
Read more about sparsification.
Neural Magic's DeepSparse integrates into popular deep learning libraries (e.g.
Hugging Face, Ultralytics), allowing you to leverage DeepSparse for loading and deploying sparse models with ONNX. ONNX gives you the flexibility to serve your model in a framework-agnostic environment. Support includes PyTorch, TensorFlow, Keras, and many other frameworks.
Part 1: NLP - ChatGPT Cheatsheet
(1) Text Generation
write an intro paragraph to a mystery novel
(2) Summarization
summarize this text: [long text]
(3) Open Domain Question Answering
when did Apollo 11 land on the moon
(4) Sentiment Analysis (few-shot or zero-shot)
I like pizza, positive
I don't like pizza, negative
sometimes, I like pizza sometimes I don't, neutral
while the movie was good, I sometimes thought it was a bit dry,
(5) Table to Text
summarize the data in this table:
I like pizza, positive
I don't like bananas, negative
Sometimes I like pizza but sometimes I don't, neutral
While the movie in general was pretty good, I sometimes thought it was a bit dry,
(6) Text to Table
create a table from this text: create a two-column table where the first column contains the stock ticker symbols for Apple, Google, Amazon, and Meta, and the second column contains the names of the companies.
(7) Token Classification (few-shot or zero-shot)
classify the named entities in this text: George Washington and his troops crossed the Delaware River on December 25, 1776 during the American Revolutionary War.
(8) Dataset Generation (few-shot or zero-shot)
generate more datapoints from this text:
"contains no wit , only labored gags "
0 (negative)
"that loves its characters and communicates something rather beautiful about human nature " 1 (positive)
"remains utterly satisfied to remain the same throughout "
0 (negative)
(9) Machine Translation
translate this text into Portuguese: welcome to the matrix
Part 2: Code - ChatGPT Cheatsheet
(10) Code Generation
show me how to make an http request in Python
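A minimal sketch of the kind of answer this prompt might return, using only the standard library (urllib.request). The local in-process server below is not part of the answer itself; it is included only so the example runs without external network access.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """Tiny test server so the GET example has something to talk to."""
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet

def make_get_request(url):
    """Perform an HTTP GET and return (status_code, body_text)."""
    with urllib.request.urlopen(url) as response:
        return response.status, response.read().decode()

server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status, text = make_get_request(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(status, text)  # 200 hello
```

In practice ChatGPT will often suggest the third-party `requests` library instead, as in the snippet under (12) below; `urllib.request` is shown here because it needs no installation.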
(11) Code Explanation
explain this python code:
from deepsparse import Pipeline
qa_pipeline = Pipeline.create(task="question-answering")
inference = qa_pipeline(question="What's my name?", context="My name is Snorlax")
>> {'score': 0.9947717785835266, 'start': 11, 'end': 18, 'answer': 'Snorlax'}
(12) Docstrings Generation
write a docstring description for this function:
import requests

def make_get_request(url):
    response = requests.get(url)
    return response.status_code, response.text

make_get_request('https://www.example.com')
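One plausible docstring ChatGPT could produce for this function (NumPy style shown here; the exact wording and style will vary from run to run):

```python
def make_get_request(url):
    """Send an HTTP GET request to the given URL.

    Parameters
    ----------
    url : str
        The address to send the request to.

    Returns
    -------
    tuple of (int, str)
        The HTTP status code and the response body as text.
    """
    import requests  # third-party; imported lazily so the sketch stands alone
    response = requests.get(url)
    return response.status_code, response.text
```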
(13) Programming Language Conversion
convert this code from Python to Javascript:
print("hello world")
(14) Data Object Conversions (JSON, XML, CSV etc.)
convert this JSON object into XML:
{"Name":{"0":"John Smith","1":"Jane Doe","2":"Bob Johnson","3":"Samantha Williams"},"Age":{"0":32,"1":28,"2":45,"3":40},"Gender":{"0":"Male","1":"Female","2":"Male","3":"Female"},"Occupation":{"0":"Software Developer","1":"Data Analyst","2":"Project Manager","3":"Marketing Director"}}
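The same conversion can also be scripted. A standard-library sketch (element names `records`/`record` are illustrative choices, and the input is shortened to two rows): the JSON above is column-oriented, pandas `.to_dict()` style, so the rows are reassembled before emitting one element per row.

```python
import json
import xml.etree.ElementTree as ET

# Column-oriented JSON as in the prompt above (shortened to two rows).
raw = ('{"Name":{"0":"John Smith","1":"Jane Doe"},'
       '"Age":{"0":32,"1":28},'
       '"Occupation":{"0":"Software Developer","1":"Data Analyst"}}')
data = json.loads(raw)

root = ET.Element("records")
# Row ids come from any one column; every column shares the same keys.
for row_id in sorted(next(iter(data.values()))):
    record = ET.SubElement(root, "record")
    for column, values in data.items():
        ET.SubElement(record, column).text = str(values[row_id])

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```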
(15) Knowledge Graph Generation
convert this text into nodes and edges: Babe Ruth joined the New York Yankees in 1920. The Boston Red Sox sold his contract to the Yankees. He played for the Yankees from 1920 to 1934, and during that time he established himself as one of the greatest players in baseball history.
(16) HTML to Text (Web Scraping)
convert this HTML to text:
<h1 class="heading1" id="neural-magic-platform-documentation">Neural Magic Platform Documentation</h1>
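For comparison, the same tag-stripping can be done programmatically; a minimal sketch with the standard library's `html.parser`, applied to the fragment above:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML fragment, dropping all tags."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

extractor = TextExtractor()
extractor.feed('<h1 class="heading1" id="neural-magic-platform-documentation">'
               'Neural Magic Platform Documentation</h1>')
plain_text = "".join(extractor.parts)
print(plain_text)  # Neural Magic Platform Documentation
```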