
LessWrong (30+ Karma) “The Iliad Intensive Course Materials” by Leon Lang, David Udell, Alexander Gietelink Oldenziel
We are releasing the course materials of the Iliad Intensive, a new month-long, full-time AI Alignment course that runs in person every second month. The course targets students with strong backgrounds in mathematics, physics, or theoretical computer science, and the materials reflect that: they include mathematical exercises with solutions, self-contained lecture notes on topics like singular learning theory and data attribution, and coding problems, at a depth unmatched for many of the topics we cover. Around 20 contributors (listed further below) were involved in developing these materials for the April 2026 cohort of the Iliad Intensive.
By sharing the materials, we hope to
- create more common knowledge about what the Iliad Intensive is;
- invite feedback on the materials;
- and allow others to learn via independent study.
We are continuing to develop the materials and plan to eventually release them on a continuously maintained website. We will also add, remove, and modify modules going forward to improve and expand the course over time. When we release a significantly updated version of the materials, we will update this post with a link to it.
Modules
The Iliad Intensive is structured into clusters, which are [...]
---
Outline:
(01:26) Modules
(02:32) Cluster A: Alignment
(05:00) Cluster B: Learning
(11:00) Cluster C: Abstractions, Representations, and Interpretability
(15:40) Cluster D: Agency
(19:23) Cluster E: Safety Guarantees and their Limits
(23:04) Contributors
(26:36) Impressions from April
(29:02) Acknowledgments
(29:11) Feedback
---
First published:
May 11th, 2026
Source:
https://www.lesswrong.com/posts/dWQnLi7AoKo3paBXF/the-iliad-intensive-course-materials
---
Narrated by TYPE III AUDIO.