
Training Data | OpenAI's Greg Brockman: Why Human Attention Is the New Bottleneck
May 1, 2026

Greg Brockman, OpenAI co-founder and chief builder, leads compute, models, and product strategy. He explains why compute is never enough and why we may be 80% of the way to AGI. He describes agentic coding tools moving from writing 20% to 80% of code, why human attention is becoming the scarcest resource, and how fleets of agents could reshape organizations.
Compute Demand Far Exceeds Supply
- OpenAI's core business is buying, renting, and reselling compute at a positive margin, so demand for compute is effectively unlimited.
- Greg Brockman says they constantly hunt for more GPUs and that when ChatGPT launched he advised buying "all of it" because demand outstrips supply.
Scaling Laws Behave Like Fundamental Truths
- Scaling laws feel fundamental: pour more compute into neural nets and capabilities increase without an obvious wall.
- Brockman frames scaling laws as empirical truths and notes that architecture shifts (like the transformer) still drive big leaps.
Model Implemented And Optimized A Design Overnight
- A systems engineer handed a design doc to GPT-5.3 and woke up to a finished, profiled, and optimized implementation.
- The model iterated overnight: it implemented the spec, added instrumentation, profiled the slow spots, and optimized the code.

