
First Principles with Christian Keil #3: Extropic - Why Thermodynamic Computing is the Future of AI (PUBLIC DEBUT)
Mar 12, 2024
Explore Extropic's concept of thermodynamic computing, a new kind of computer with vast potential. The episode covers the challenges of quantum computing and the limitations of Gaussian distributions in ML, then looks at harnessing noise with thermodynamic computers, disrupting AI from first principles, and early applications in probabilistic domains. It also touches on scaling digital computers, how confidence in the idea grew over time, and the future of AI with thermodynamic computing.
Episode notes
Curse Of Dimensionality For Distributions
- Representing general probability distributions scales exponentially with dimensionality, making direct storage impractical.
- Sampling and physics-native approaches avoid exponential memory growth by generating samples instead of full tabulations.
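The scaling argument above can be made concrete with a minimal sketch (illustrative names, not from the episode): a full tabulation of a joint distribution over n binary variables requires 2^n probabilities, while a sampler only needs memory linear in n per draw, because it generates outcomes instead of storing the whole table.

```python
import random

def table_entries(n):
    """Entries needed to tabulate a joint distribution over n binary variables."""
    return 2 ** n

def draw_sample(n, p=0.5, rng=None):
    """Draw one configuration of n independent binary variables.

    Memory per draw is O(n): no table of the full distribution is ever built.
    (Independent bits keep the sketch simple; the same memory argument applies
    to correlated samplers such as Gibbs sampling.)
    """
    rng = rng or random.Random(0)
    return [int(rng.random() < p) for _ in range(n)]

# The table explodes exponentially; the sample stays linear.
print(table_entries(10))         # 1024 entries -- still storable
print(table_entries(100))        # ~1.3e30 entries -- impossible to store
print(len(draw_sample(100)))     # 100 values per sample
```

At n = 100 the tabulation already exceeds any conceivable memory, which is why sampling-based and physics-native approaches sidestep the table entirely.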
Thermodynamics Over Quantum For Probabilistic ML
- Quantum computers are powerful for quantum interference but offer limited practical advantage for classical probabilistic ML given their engineering overhead.
- Extropic aims for large constant-factor speedups using thermodynamic devices rather than asymptotic quantum complexity gains.
Leverage Existing Supply Chains
- Build thermodynamic computers on scalable manufacturing platforms to reach practical deployment in this decade.
- Lean on existing semiconductor supply chains rather than bespoke quantum supply chains.
