Artificial Intelligence and You

141 - Special Episode: Understanding ChatGPT

Feb 27, 2023
INSIGHT

Transformer Pretraining Fueled GPT Breakthroughs

  • Transformer architectures and large-scale pretraining let language models use long-range context to predict the next token, enabling them to generate coherent paragraphs.
  • Scott traces Word2Vec to Attention Is All You Need and OpenAI's progression from GPT-2 (1.5B) to GPT-3 (175B) and InstructGPT.
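The pretraining objective behind this progression, predicting the next token from preceding context, can be sketched with a toy character-level bigram model. This is purely illustrative: real GPT models use transformer attention over far longer contexts and learn from vastly more data.

```python
from collections import defaultdict

def train_bigram(text):
    """Count next-character frequencies: the pretraining objective
    in miniature, i.e. predict the next token given the current one."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Return the most frequently observed character after `ch`."""
    followers = counts.get(ch)
    if not followers:
        return None
    return max(followers, key=followers.get)

corpus = "the cat sat on the mat and the cat ran"  # tiny made-up corpus
model = train_bigram(corpus)
print(predict_next(model, "h"))  # prints "e": "h" only ever precedes "e" here
```

Scaling this idea up, from counting pairs to attending over thousands of tokens with billions of parameters, is what the GPT-2 to GPT-3 jump Scott describes amounts to.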
INSIGHT

How GANs Turn Noise Into Convincing Images

  • Image generators used adversarial training to iteratively refine noise into photorealistic images.
  • Scott explains the GAN loop: a generator tweaks noise until a recognizer (the discriminator) can no longer tell the image is synthetic.
ADVICE

Update Testing Methods To Verify Student Work

  • Reinstate assessment methods that verify original thought, like in-person or proctored exams.
  • Scott cites professors switching from take-home essays to supervised pen-and-paper tests after ChatGPT produced indistinguishable answers.