How Microsoft 365 Copilot works at a high level: https://youtu.be/B2-8wrF9Okc
An inside look at how large language models (LLMs) behave when connected to the data in your organization, and how the process keeps that data private.
State of GPT talk by Andrej Karpathy: https://www.youtube.com/watch?v=bZQun8Y4L2A&t=373s
Would highly recommend watching the above! A 45-minute lecture covering the state of generative LLMs: how they are trained, what they can and can't do, advanced techniques such as CoT, ReAct, Reflection, BabyAGI, and agents in general, and finally some great tips on using LLMs in production. Accessible, but very informative.
Continuous Learning_Startup & Investment
Here's an AssemblyAI (https://assembly.ai) transcript with chapter summaries:
https://www.assemblyai.com/playground/transcript/64kyzev80o-6ed4-4902-a066-7df25c363193
Andrej Karpathy is a founding member of OpenAI. In the first part he talks about how GPT assistants are trained; in the second, how to use these assistants effectively in your applications.
TRAINING NEURAL NETWORKS ON THE INTERNET
We have four major stages: pretraining, supervised fine-tuning, reward modeling, and reinforcement learning. Each stage has a dataset that powers it, and an algorithm that, for our purposes, is an objective for training a neural network.
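The "objective" for the pretraining stage is next-token prediction. A toy numpy version of that cross-entropy objective is sketched below; the tiny vocabulary, targets, and random logits are invented for illustration, not anything from the talk:

```python
import numpy as np

vocab = ["the", "cat", "sat", "on", "mat"]
V = len(vocab)
rng = np.random.default_rng(0)

# Pretend logits from a model at 4 positions, each predicting the next token.
logits = rng.normal(size=(4, V))
targets = np.array([1, 2, 3, 4])  # "cat sat on mat" follows "the cat sat on"

def next_token_loss(logits, targets):
    # Standard cross-entropy over the vocabulary at every position.
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

print(next_token_loss(logits, targets))
print(next_token_loss(np.zeros((4, V)), targets))  # uniform logits: exactly log(V)
```

Training lowers this loss by making the model assign higher probability to the token that actually comes next; a model that is maximally unsure scores log(V).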
GPT 3.1: BASE MODELS AND AGENTS
The GPT-4 model that you might be interacting with over the API is not a base model, it's an assistant model. You can even trick base models into being assistants, but instead we have a different path to make actual GPT assistants, not just base-model document completers.
NEUROANATOMY 2.8
In the reward modeling step, we shift our data collection to the form of comparisons. Because we then have a reward model, we can score the quality of any arbitrary completion for any given prompt. At the end, you can deploy an RLHF model.
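That comparison-based setup can be sketched with the standard pairwise (Bradley-Terry) objective: the reward model is trained so the preferred completion scores higher than the rejected one. The toy linear "reward model" and synthetic features below are illustrative assumptions, not the talk's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy features for (chosen, rejected) completion pairs.
X_chosen = rng.normal(1.0, 1.0, size=(64, 8))    # preferred completions
X_rejected = rng.normal(0.0, 1.0, size=(64, 8))  # dispreferred completions

w = np.zeros(8)  # parameters of a linear "reward model"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    # Bradley-Terry: maximize P(chosen > rejected) = sigmoid(r_chosen - r_rejected)
    margin = X_chosen @ w - X_rejected @ w
    return -np.mean(np.log(sigmoid(margin)))

for _ in range(200):  # plain gradient descent on the pairwise loss
    margin = X_chosen @ w - X_rejected @ w
    grad = -np.mean((1 - sigmoid(margin))[:, None] * (X_chosen - X_rejected), axis=0)
    w -= 0.1 * grad

print(loss(w))  # well below log(2) ~ 0.693, the score of an untrained model
```

Once trained, `X @ w` scores any arbitrary completion, which is exactly what the RL stage then optimizes against.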
COGNITIVE PROCESSES AND GPT
How do we best apply a GPT assistant model to your problems? Think about the rich internal monologue and tool use and how much work actually goes computationally in your brain to generate this one final sentence. From GPT's perspective, this is just a sequence of tokens.
TREE OF THOUGHT AND PROMPT ENGINEERING
A lot of people are playing around with prompt engineering to bring back, for LLMs, some of these abilities we have in our brains. Tree of Thought is something like an equivalent of AlphaGo, but for text. I would not advise people to use it in practical applications yet.
WHAT ARE THE QUIRKS OF LLMS?
The next thing I find interesting is that LLMs don't want to succeed, they want to imitate, so at test time you actually have to ask for good performance. Next up, a lot of people are interested in retrieval-augmented generation.
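A minimal sketch of the retrieval-augmented idea, with a toy bag-of-words "embedding" standing in for a real embedding model; the documents and scoring below are made up purely for illustration:

```python
from collections import Counter
import math

docs = [
    "GPT assistants are trained with RLHF after pretraining",
    "K-means clustering groups points by nearest centroid",
    "Firestore is a document database by Firebase",
]

def embed(text):
    # Toy bag-of-words "embedding"; real systems use learned dense embeddings.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def retrieve(query, k=1):
    # Rank documents by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

context = retrieve("how does k-means clustering work")
prompt = f"Using this context: {context}\nAnswer the question."
print(context[0])
```

The retrieved text is then pasted into the prompt before generation, which is the whole trick: the model answers from relevant working memory rather than from parametric recall alone.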
CONSTRAINT PROMPTING IN LLMS
Next, I wanted to briefly talk about constraint prompting: techniques for forcing a certain template on the outputs of LLMs. This kind of constrained sampling is also extremely interesting.
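One lightweight way to force a template is validate-and-retry, sketched below with a stubbed-out `generate` function standing in for a real model call (true constrained sampling, as in libraries like Microsoft's guidance, works at the token level by masking logits instead):

```python
import json
import re

# The required output template: a flat JSON object with a name and an age.
TEMPLATE = re.compile(r'^\{"name": "[A-Za-z ]+", "age": \d+\}$')

def generate(prompt, attempt):
    # Stub standing in for an LLM call; a real version would hit a model API.
    candidates = ['Sure! The user is Bob, age 42.',
                  '{"name": "Bob", "age": 42}']
    return candidates[min(attempt, len(candidates) - 1)]

def constrained_generate(prompt, max_attempts=5):
    for attempt in range(max_attempts):
        out = generate(prompt, attempt)
        if TEMPLATE.fullmatch(out):
            return json.loads(out)  # guaranteed to parse once it matches
        # Tighten the instruction and try again.
        prompt += '\nRespond ONLY with JSON: {"name": ..., "age": ...}'
    raise ValueError("model never conformed to the template")

print(constrained_generate("Extract the user info."))
```

The retry loop is wasteful compared with logit masking, but it needs nothing beyond a validator, which is why it is a common first implementation.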
FINE-TUNING A LANGUAGE MODEL
You can get really far with prompt engineering, but it's also possible to fine-tune your models. Fine-tuning is a lot more technically involved: it requires human data contractors for datasets and/or synthetic data pipelines. Break your task up into two major parts.
LIMITS TO FULLY AUTONOMOUS LLMS
There are a large number of limitations to LLMs today, so keep that firmly in mind for all your applications. My recommendation right now: use LLMs in low-stakes applications, and always combine them with human oversight. Think copilots instead of completely autonomous agents.
In this post, I try to answer specific questions about the internals of Copilot, while also describing some interesting observations I made as I combed through the code. I will provide pointers to the relevant code for almost everything I talk about, so that interested folks can take a look at the code themselves.
https://thakkarparth007.github.io/copilot-explorer/posts/copilot-internals
<Reducing stimulation and slowing down your thinking>

These days it feels like modern people work in an almost ADHD-like state, because it is so easy to keep exposing yourself to a continuous stream of high-intensity stimulation. In an environment like this, it is hard to calmly concentrate on one thing and think deeply.

Let me start with two cases.

Case 1)

Mr. K, someone I know, was a junior employee at a large company. Hundreds of work requests came in every day, so he ended up leaving the office at 11 p.m. every night.

Then he heard about leaving work on time and decided to run an experiment: going home at the official end of the day. He told his boss: starting today I will leave at 6 p.m.; if you ever feel my output is slipping even slightly, tell me and I will go back immediately. From then on he left at 6. When he got home, he played with his six-year-old child for two hours, from 7 to 9 p.m. Until then the child had effectively had no father: on weekdays dad came home at 11 p.m., left before the child woke up, and spent weekends collapsed in bed. That child now had a "dad."

But there was one problem: all the work he could not finish because he left on time. Because of security policy, he could not access the company computers or documents from home. So what he did at home, from 11 p.m. to 1 a.m., was sit at his desk with scrap paper spread out and plan strategy: how to handle today's unfinished work and tomorrow's tasks most intelligently. He did that every day.

The next day at the office, more than half of the backlog had often resolved itself (the requesting department, tired of waiting, had handled it on their own), and the remaining half he could finish quickly using the smarter approach he had worked out the night before.

As a result, he actually slept less than when he had left at 11 p.m.; back then he would collapse into bed the moment he got home. But he says his body felt far more energetic.

Case 2)

Long ago in the army, I was posted to a unit and assigned a senior to train me. But it was hard to even catch sight of his face. A few days later I found out why: his discharge was one week away. His post was an administrative one in the battalion maintenance section; the work was so extensive and complicated that it normally took about a month of handover before you could do the job properly. But this man was leaving in a week, and even during that week he mostly skipped work; at best he would occasionally drop by the maintenance section, let me ask a question or two, and leave. The real problem was that nobody, officer or enlisted, precisely understood what his post involved.

In the end he was discharged with me having learned almost nothing, and the work kept pouring in every day, with no reference material whatsoever.

After much agonizing, what I ended up doing was thinking and acting from first principles. Whenever a problem situation arose, I reasoned logically from the basic principles I believed in (for example: which course of action benefits the army?) and acted on the conclusion. You could say I worked while designing all the rules and procedures myself. And so there was nothing to get stuck on: if I thought it through deeply on my own and acted accordingly, everything worked out.

Surprisingly, this method worked. In the end I built the entire system myself and won a few commendations for it. When inspectors came down from corps headquarters, I, a mere conscript, would gather the officers and give informal lectures.

----

Sometimes limiting external stimulation and information and concentrating on thinking is what helps. The muscles and skills of thinking for yourself grow stronger, too.

So I recommend, for example, things like the following:

* When a bug appears, don't immediately paste it into a search box. Spend at least five or ten minutes with blank paper, sketching out the problem situation and guessing at causes.

* When you want to get into a field you know nothing about, rather than searching the internet, buy three books written in different styles and read them side by side, comparing them as you go (I call this bounded exploration; without it, it is easy to waste time wandering deeper and deeper without ever properly absorbing any one framework).

* When you have a complex problem to solve, don't go looking for more information at all. Spread out blank paper and spend 30 minutes designing a solution using only logic, your own thinking, and your past experience.

https://www.facebook.com/100000557305988/posts/pfbid02joCFDgeyR58vuv2MyZqQWJ1cf7FwrYZHS6FLq9ox8Bqu2RE9cV3HdgzWdHJvopjkl/?mibextid=jf9HGS
Could one language model handle all programming languages? Or should we tailor a model for each? What's your take? #LLM #ProgrammingLanguages https://www.linkedin.com/posts/mateizaharia_introducing-english-as-the-new-programming-activity-7080242815120637952…
Data science is getting so easy

Thanks to ChatGPT, data science is becoming dramatically easier. Previously, the knowledge required to produce the chart below using clustering was as follows.

## Time to learn Google Colab:
1. Learning basic usage took about 1 week.
2. More complex tasks, for example loading external data or handling large volumes of data, took an additional 1-2 weeks.

## Data science background knowledge:
1. Clustering: 1-2 weeks of study for a basic understanding.
2. Clustering evaluation metrics: about 1 week for a basic understanding of each metric.
3. Data analysis and processing: this topic is broad, so mastering basic preprocessing and analysis techniques took at least 1-2 months.

## API knowledge:
1. Firebase Firestore: learning basic usage took 1-2 weeks.

## Coding skills:
1. Python: about 1-2 months to pick up the basic syntax.
2. NumPy: about 1-2 weeks for the basic features.
3. Matplotlib: about 1 week to learn basic plotting.

Adding up the learning time for each item above: data science fundamentals, about 2-4 months; API knowledge (Firebase Firestore), about 1-2 weeks; coding skills (Python, NumPy, Matplotlib), about 2-3 months. So the total learning time comes to roughly 4-7 months.

----

# With ChatGPT, it shrinks to the following.

Since the AI handles the coding and the experiment design, those learning hours can be dropped. What remains is a light background in data science and a working understanding of Google Colab.

1. Data science background: with the AI assistant explaining along the way, this can shrink to about 1 month; in some cases 2 weeks is enough to grasp the basic concepts.
2. Google Colab: with the AI assistant's help, the learning time can drop to about 1 week. - Honestly, it feels like 1 hour might be enough.

In this case, the total learning time comes to an estimated 1-2 months. If you already know how to code and use APIs, it can be even shorter.

----

Bottom line: for a beginner, a 6-month course becomes a 1-month course. For someone like me, with some data science background but no knowledge of the Python libraries, about 3 weeks shrank to a couple of hours. And beyond this, even 4 years would not be enough to learn data science in its entirety.

In the end, one senior data scientist can now do work that used to take more than 10 junior scientists and junior data engineers.

In Silicon Valley, junior data scientists are already losing their jobs at a rapid pace.

Korean curricula will have to change too. If anything, the same amount of time could now be spent teaching deeper theory. Education should also be designed around research methodology rather than hands-on coding. A situation is coming where data scientists will have to compete on knowledge rather than on practical tooling skills.

---

The prompt used for the scatter plot below:
1. Get the latest 1000 samples from user_tribes collection
2. tribeId is cluster id, x and y are the coordinates.
3. Measure the homogeneity and completeness using colab.
4. Visualize the results.

I never even mentioned K-means, but it went ahead and used it on its own.

https://www.facebook.com/634740022/posts/pfbid0cuABUXxgECdMwZfQaZ9u88HqXaLoLKzdJxBGLSsfHMfUovKRdQnuybjUYc9sJycsl/?mibextid=jf9HGS
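Those four prompt steps correspond to just a few lines of scikit-learn. The sketch below substitutes synthetic blobs for the `user_tribes` data (so the collection access in step 1 is faked), with `tribeId` playing the role of the reference labels for homogeneity and completeness:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; in Colab the inline backend is fine
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import completeness_score, homogeneity_score

# Step 1 (faked): "latest 1000 samples" -> synthetic x, y with tribeId labels.
X, tribe_id = make_blobs(n_samples=1000, centers=4, random_state=42)

# K-means, since that's what ChatGPT reached for unprompted.
pred = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)

# Step 3: score predicted clusters against the tribeId reference labels.
h = homogeneity_score(tribe_id, pred)
c = completeness_score(tribe_id, pred)
print(f"homogeneity={h:.3f} completeness={c:.3f}")

# Step 4: visualize.
plt.scatter(X[:, 0], X[:, 1], c=pred, s=8)
plt.title("user_tribes clusters (synthetic stand-in)")
plt.savefig("tribes.png")
```

Both metrics run from 0 to 1: homogeneity asks whether each predicted cluster contains only one tribe, completeness whether each tribe lands in a single cluster.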
A small-talk skill: Conversation Threading

In the context of socializing, Conversation Threading is the habit, or conversational technique, of deliberately dropping keywords about yourself into the conversation as extra information, so that the other person (or others in the group) can pick one up and the conversation naturally continues along that topic thread.

For example, when asked "Where is your hometown?", most people would answer curtly, "Yeosu, South Jeolla." Someone practicing CT would instead say, "Yeosu, South Jeolla: it's by the sea, famous for the song 'Yeosu Bambada,' and a pilgrimage site for people who went there on school trips." Now anyone in the group can pick up a thread: they can talk about the song, about a trip they once took to Yeosu, or at the very least ask a question about school trips.

Many of the leaders who influenced me in the past (especially English speakers) used this skill naturally, and I worked hard to learn it by watching them. Even now, though, it is genuinely difficult to tell a well-placed topic thread from TMI. In any case, if you make the effort to drop keywords, conversations that would otherwise have passed by get connected as the people around you pick them up, and rapport becomes much easier to build.

https://loopward.com/improve-conversation-skills-using-conversational-threads-and-sharing-experiences/
https://www.facebook.com/1150372185/posts/pfbid02ke1dLH2EPwSGkNSSGVL7NutMUkGN5ADNT2Zzeh3cQE8BK1rmHNoiGwz75kVT22v8l/?mibextid=jf9HGS
A few words for women leaders --

There was a talk session with women leaders. Somewhat awkwardly, I was asked to speak to them from a man's perspective; here are a few of the points I made:

0. It is clear that building a career as a woman means playing on a much more tilted field. Fortunately, it seems to be improving bit by bit as society changes.

1. Rather than imitating male leaders, why not leverage your strengths as a woman?
- In the past, women with a masculine style, tougher than the men around them, were considered the ones suited to leadership.
- In an uncertain and diverse era, as communication, empathy, coaching, and inclusion become central to leadership, women's strengths are exactly what leaders now need.
- So use your strengths as a woman to the fullest.

2. Express yourself with confidence.
- Being gentle and empathetic is not the same as lacking confidence. You can be soft and still be firm with people who treat you carelessly, and carry confidence in everything you do.
- Don't be overly modest and accommodating; be bold and self-assured. Men far less capable than you are far more brimming with confidence.

3. Take on bigger responsibilities, leadership roles, and projects.
- Rather than being boxed in by your R&R and hesitating, boldly take on projects and responsibilities where you can grow and contribute.

4. Don't blame yourself, and recover quickly from hurt or shaken feelings.
- Don't fault yourself or shoulder all the responsibility. It is not your fault.
- Emotions are not a bad thing, but rather than dwelling on fear, disappointment, sadness, or anger, or venting them too intensely, build stress-relief routines (exercise, meditation, walking, and so on) and bounce back quickly.

5. Shake off perfectionism.
- Be strategically incompetent.
- You don't have to do everything well.
- You don't need to live life as if it were homework or an exam. Trying to score 100 on everything at once (work, family, relatives, children, social life) is simply exhausting.

6. Look with compassion on tormenting bosses and difficult people.
- They are not as formidable as they seem. Up close, they are all just ordinary men and women.
- They bluster in order to survive, so look at them with compassionate eyes.

7. Walk your own path, regardless of anyone's disregard.
- Don't give up on your career too early. I have seen far more people regret that later.
- Commit to your own career and life rather than to someone else's dismissal of you.

p.s. As one participant pointed out, I spoke from an individual's perspective rather than a structural one. What is really needed is probably not individual effort so much as change in society's structures, awareness, and systems.

https://www.facebook.com/100006237757461/posts/pfbid02Nuwoy8XghsAmtxmR8vfCRi6naXNwMYVS5V4qybY4pgjbturAb1RHV26YSLYQBoXHl/?mibextid=jf9HGS
Forwarded from YM리서치
UK invests ₩4 trillion: large-scale expansion of the healthcare workforce, with medical school places doubled
https://naver.me/FXrMhiDg
Marking the NHS's 75th anniversary, and after the service hit its biggest-ever crisis through COVID-19 (with even doctors striking), the UK government will invest £2.4 billion (about ₩4 trillion) in the National Health Service over five years, introducing AI technology, expanding clinical training, and adding 300,000 new staff.
Forwarded from 전종현의 인사이트
On the hot topics in AI. Every point is worth thinking through one by one, and personally I agree with all of it.
And above all, the conclusion could not be more right.
Listen carefully to other people's opinions, but in the end the judgment has to be your own.
https://luttig.substack.com/p/hallucinations-in-ai
Forwarded from Buff
Notes from meetings with 20-30 companies (conference calls with an average of 3-4+ companies a day over the past 2-3 weeks)
Source: https://blog.naver.com/tosoha1/223143233920
1. Companies in the small and mid-cap cosmetics industry, growing on the back of overseas exports, seem to be showing continued growth.
2. Many medical-device and beauty-related companies are also growing steadily, driven by exports.
3. Equipment companies I had worried about, thinking sluggish oil prices might mean relatively weaker orders, were still stringing together solid orders.
4. In shipbuilding, companies with 2-3 years of full order backlogs were talking about solid ship prices and orders ahead.
5. Orders at power-equipment companies felt even better than expected. They seemed to be cautiously suggesting this could be a long cycle.
6. Auto-sector companies, which always carry peak-out worries, are still maintaining good results.
7. Defense companies no longer have the momentum they once did, but are continuing relatively solid orders.
8. Even companies in the supposedly dying semiconductor industry seemed fairly confident about passing the bottom within Q1-Q2.
Lessons gleaned from my wise bro
- Trust is built through a tapestry of gentle acts and kept promises.
Doug Leone of Sequoia Capital
Trust is forged through a blend of sincerity and excellence; in the absence of either, securing trust becomes challenging. Integrity is essential for fostering a healthy work environment (without it, you risk alienating your team), while a lack of excellence likewise hampers trust-building. Trust sets the tone and vision for a company. Once the company's direction is determined, it's crucial to attract the talent required to turn that vision into reality.
Jeff Bezos:
The founder of Amazon shared his 4-step method for building trust and reputation:
1. Do hard things: Earn trust by doing hard things well over and over again.
2. If you say you're going to do something, do it: Keep your promises and commitments.
3. Take controversial stances: Be willing to take risks and make tough decisions.
4. Have clarity: Be clear in your communication and decision-making.