
Embedded Insiders Edge AI Solutions, The Memory Crisis, and Sustainable AI
Apr 2, 2026
Sean Dougherty, VP at Everspin Technologies, explains MRAM and the memory capacity and pricing crunch from AI demand. Robert Otręba, CEO of GRINN, outlines compact Edge AI system-on-modules and ready-to-deploy SBCs. They discuss sustainable AI's resource toll, practical Edge AI demos at Embedded World, modular hardware trends, and which markets face memory pain.
AI Snips
Generative AI Drives Large Energy And Water Costs
- Generative large language models carry heavy environmental costs through data-center energy and water consumption.
- Ken Briotta warned that AI data centers consume energy on the scale of a top-20 country, often running on fossil fuels and requiring large volumes of cooling water.
Generative Chatbots Are Sophisticated Guessing Machines
- Ken Briotta argued generative chatbots are statistical plagiarism machines that often return low-value results and expose users to legal IP risk.
- He compared modern chatbots to the Mechanical Turk, the chess-playing "automaton" hoax, calling them sophisticated guessing engines rather than true intelligence.
Use AI At The Edge For Practical Tasks
- Focus AI effort on edge use cases such as computer vision and object recognition, where models add clear product value.
- Ken recommended embedding AI in edge devices for data parsing and statistical analysis rather than relying on cloud-based chatbots.
