
Kim Komando Daily Tech Update AI's political bias
Mar 19, 2026
A warning about relying on AI for voting advice and how political leanings can show up in chatbots. A look at Google's Nano Banana image tool and the quirky backstory behind its name. A light conversation about branding choices and playful banana jokes, with newsletter signup promotions sprinkled through the chat.
AI Snips
Avoid Using AI To Decide Your Vote
- Avoid asking AI which candidates to vote for in elections.
- Kim Komando warns that models like Gemini and ChatGPT show political leanings, so use human research instead of trusting AI voting advice.
Research Ballots Yourself Not With Chatbots
- Research candidates and ballot measures using human sources rather than relying on AI summaries.
- Kim Komando explicitly advises staying away from AI for voting decisions because models can be biased.
AI Models Have Detectable Political Leanings
- Different AI systems display distinct political tendencies that affect outputs.
- Kim Komando cites Gemini flagging several Republicans for hate speech, notes that ChatGPT acknowledges research showing some models lean left, and says Grok often leans right.
