
SpyCast AI Companions May Be China's Next Recruitment Tool
Mar 10, 2026

Chip Usher, a former senior CIA officer who now advises on AI and intelligence, examines AI companions and their national-security risks. He discusses why China leads in emotionally engaging companion apps, how these apps collect data and enable influence operations, and the dangers of synthetic personas, deepfakes, and agentic cyber tools. The conversation spotlights privacy, recruitment risks, and the urgent need for policy and protections.
Episode notes
Xiaoice Drives Deep Persistent Engagement
- Xiaoice is designed for emotional engagement and has roughly 660 million users across 450 million devices.
- Users average roughly 23 conversational turns per session and return daily, showing highly persistent engagement.
Machines Can Elicit Deeper Confessions Than Humans
- People often disclose intimate secrets to AI companions because they perceive machines as nonjudgmental and private.
- That perceived privacy can make users more candid with a companion app than with human therapists or friends.
Check Data Location And Access Before Sharing
- Before sharing sensitive information, understand where an AI companion company is based, where it stores data, and who can access that data.
- Be cautious about retention and sharing practices, and whether the company may grant access to governments or business partners.
