
Uncomfortable Conversations with Josh Szeps "Artificial Intelligence in a Human World" with Prof. Toby Walsh
Jul 3, 2025
Toby Walsh, professor of AI and chief scientist at UNSW, briefly introduces his work and book. He and Josh probe AI’s impact on jobs, creativity, and whether current models edge toward AGI. They debate hallucinations, language as the core of intelligence, AI-driven science like protein folding, and ethical dilemmas around genetics and embryo selection.
Language Mastery Was Statistical Not Magical
- Mastering language was expected to be hard, yet statistical models trained on massive data delivered a step change in capability.
- Language matters because thought emerges in language, and LLMs command that medium at scale.
AI Learns Language By Reading Way More Than Humans
- LLMs needed vastly more data than humans to learn language, underscoring how efficiently evolution has equipped humans to acquire it.
- Walsh notes that GPT models were trained on significant fractions of the internet, far more text than any human could read in a lifetime.
AI Will Unlock Animal Languages
- AI will decode animal communication by finding correlations in massive datasets, enabling translation of whale, elephant, and corvid signals.
- Walsh cites early glimpses like elephants having names for each other.


