
TechCrunch Industry News: Hyperscale Power is the latest to challenge transformer tech; plus, YouTube expands AI deepfake detection
Mar 10, 2026
A look at a startup building compact solid-state transformers to shrink power hardware in cramped AI data centers. Discussion of high-frequency designs, seed funding, and why transformer size matters as rack power climbs. Coverage of a new pilot that lets verified politicians, journalists, and officials flag AI-generated likenesses for removal on a major video platform.
Episode notes
Solid-State Transformers Are a Near-Certain Data Center Need
- Solid-state transformers (SSTs) are becoming essential as data center power density rises and traditional iron-core transformers hit practical limits.
- Hyperscale Power claims a novel SST operating at tens of kilohertz to drastically reduce size and footprint compared with existing solutions.
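The size claim follows from basic transformer physics: for a fixed voltage, turn count, and flux density, the EMF equation makes required core area inversely proportional to frequency. The sketch below illustrates that scaling with assumed, illustrative numbers (winding voltage, turns, and flux densities are not Hyperscale specs; the episode gives only "tens of kilohertz").

```python
# Back-of-envelope sketch of why switching frequency shrinks a transformer.
# Transformer EMF equation: V_rms = 4.44 * f * N * B_max * A_core,
# so with V, N, and B_max held fixed, core area scales as 1/f.
# All numeric values are illustrative assumptions, not Hyperscale specs.

def core_area(v_rms, f_hz, turns, b_max):
    """Required core cross-section (m^2) from the EMF equation."""
    return v_rms / (4.44 * f_hz * turns * b_max)

V, N = 480.0, 100                                # assumed voltage and turns
a_grid = core_area(V, 60, N, b_max=1.5)          # 60 Hz iron core near saturation
a_sst = core_area(V, 20_000, N, b_max=0.3)       # 20 kHz ferrite core

print(f"60 Hz iron core area:  {a_grid * 1e4:.1f} cm^2")
print(f"20 kHz ferrite area:   {a_sst * 1e4:.2f} cm^2")
print(f"area reduction factor: {a_grid / a_sst:.0f}x")
```

Even though ferrite runs at a much lower flux density than iron, the frequency term dominates, so the high-frequency core comes out dramatically smaller.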
Founder Built 99.1% Efficient Transformer During PhD
- Daniel Rothman built a 99.1% efficient solid-state transformer as part of his PhD work, informing Hyperscale's prototype design.
- That hands-on research pedigree underpins the company's claim of a much smaller, higher-frequency SST.
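At data-center scale, small efficiency differences translate into large amounts of heat that cooling systems must remove. A rough sketch, assuming a 1 MW feed and a ~98.5% conventional-transformer baseline for comparison (both assumptions; only the 99.1% figure comes from the episode):

```python
# Rough sketch of what a 99.1% conversion efficiency means at scale.
# The 99.1% figure is from the episode; the 1 MW feed and the 98.5%
# conventional baseline are assumed values for illustration.

def heat_loss_kw(input_kw, efficiency):
    """Power dissipated as heat (kW) at a given conversion efficiency."""
    return input_kw * (1.0 - efficiency)

feed_kw = 1_000.0                          # assumed 1 MW power feed
sst_loss = heat_loss_kw(feed_kw, 0.991)    # ~9 kW of waste heat
conv_loss = heat_loss_kw(feed_kw, 0.985)   # ~15 kW (assumed baseline)

print(f"SST loss:          {sst_loss:.0f} kW")
print(f"Conventional loss: {conv_loss:.0f} kW")
```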
SST Market Rapidly Attracted Venture Capital
- The SST market quickly attracted hundreds of millions in VC, with competitors like Ampersand, DG Matrix, and Heron Power raising substantial rounds.
- Taken together, the funding totals and the number of entrants point to a fast-moving competitive landscape around SSTs.
