Embedded Insiders

Edge AI Solutions, The Memory Crisis, and Sustainable AI

Apr 2, 2026
Sean Dougherty, VP at Everspin Technologies, explains MRAM and the memory capacity and pricing crunch from AI demand. Robert Otręba, CEO of GRINN, outlines compact Edge AI system-on-modules and ready-to-deploy SBCs. They discuss sustainable AI's resource toll, practical Edge AI demos at Embedded World, modular hardware trends, and which markets face memory pain.
INSIGHT

Generative AI Drives Large Energy And Water Costs

  • Generative large language models impose huge environmental costs through energy and water usage in data centers.
  • Ken Briotta warned that AI data centers' energy drain is comparable to that of a top-20 country, and that they often rely on fossil fuels and heavy water usage for cooling.
INSIGHT

Generative Chatbots Are Sophisticated Guessing Machines

  • Ken Briotta argued generative chatbots are statistical plagiarism machines that often provide low-value results and risk legal IP exposure.
  • He compared modern chatbots to the 1840s Mechanical Turk con, calling them sophisticated guessing engines rather than true intelligence.
ADVICE

Use AI At The Edge For Practical Tasks

  • Focus AI effort on edge use cases like computer vision and object recognition where models add clear product value.
  • Ken recommended embedding AI in edge devices for data parsing and statistical analysis rather than relying on cloud chatbots.