
Machine Learning Street Talk (MLST) Sara Hooker - The Hardware Lottery, Sparsity and Fairness
Oct 20, 2020
Sara Hooker, a research scholar at Google Brain and founder of Delta Analytics, dives into the complexities of AI in this discussion. She introduces the 'Hardware Lottery': the idea that which research directions succeed is often dictated by the existing hardware and software rather than by merit alone. The conversation then turns to bias in AI models, emphasizing the need for fairness and interpretability. Sara critiques current methods and advocates for solutions that prioritize model performance for underrepresented groups, bridging the gap between hardware choices and ethical AI development.
GPUs and Deep Learning's Rise
- Deep learning's rise was partly due to GPUs, initially designed for gaming, being repurposed.
- The development of compatible software ecosystems further fueled its breakthrough.
Hardware Development Inertia
- Hardware development is costly, slow, and commercially driven, hindering co-design with research.
- This creates inertia, locking in existing approaches and making exploration of new hardware paradigms difficult.
Sparsity for Model Compression
- Explore sparsity in neural networks to reduce hardware costs.
- Focus on translating unstructured sparsity to hardware for better compression.
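The snip above refers to unstructured sparsity, where individual weights are zeroed wherever their magnitude is small, producing irregular zero patterns that commodity hardware struggles to exploit. As a rough illustration of the concept (a generic magnitude-pruning sketch in NumPy, not a method described in the episode), zeroing the smallest-magnitude fraction of a weight matrix looks like this:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude `sparsity` fraction of weights.

    This is unstructured pruning: zeros can land anywhere in the
    matrix, so the resulting pattern is irregular.
    """
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(weights) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)
print(np.mean(pruned == 0.0))  # fraction of weights zeroed
```

Note that without hardware or kernel support for sparse formats, the pruned matrix still occupies the same dense storage and compute, which is exactly the hardware-translation gap the snip points at.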