Untangled

Charley Johnson
Jan 5, 2025 • 43min

Is tech a religion? Is Elon Musk a hungry ghost?

Today, I’m sharing my conversation with Greg Epstein, American Humanist chaplain at Harvard University and the Massachusetts Institute of Technology, and author of the great new book Tech Agnostic: How Technology Became the World’s Most Powerful Religion, and Why It Desperately Needs a Reformation. We discuss:

* How tech is becoming a religion, and why it’s connected to our belief that we’re never enough.
* How Elon Musk, Mark Zuckerberg, Jeff Bezos, and Bill Gates are hungry ghosts.
* What ‘tech-as-religion’ allows us to see and understand that ‘capitalism-as-religion’ doesn’t.
* My concerns with the metaphor and Greg’s thoughtful response.
* How we might usher in a tech reformation, and the tech humanists leading the way.
* The value of agnosticism and not-knowing when it comes to tech.

Okay, that’s it for now,
Charley

This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit untangled.substack.com
Dec 20, 2024 • 35min

Can we democratize AI?

Today, I’m sharing my conversation with Divya Siddarth, Co-Founder and Executive Director of the Collective Intelligence Project (CIP), about how we might democratize the development and governance of AI. We discuss:

* The CIP’s work on alignment assemblies with Anthropic and OpenAI — what they’ve learned, and why in the world a company would agree to increasing public participation.
* The #1 risk of AI as ranked by the public. (Sneak peek: it has nothing to do with rogue robots.)
* Are participatory processes good enough to bind companies to the decisions they generate?
* How we need to fundamentally change our conception of ‘AI expertise.’
* How worker and public participation can shift the short-term thinking and incentives driving corporate America.
* Should AI companies become direct democracies or representative ones?
* How Divya would structure public participation if she had a blank sheet of paper and if AI companies had to adopt the recommendations.

That’s it for now,
Charley
Dec 15, 2024 • 30min

Building healthy local communities online & off

Today, I’m sharing my conversation with Deepti Doshi, Co-Director of New_Public, about what they’ve learned building healthy local communities, online and off. We discuss:

* The problem New_Public is trying to address with their initiative, Local Lab. (Which I highlighted in my recent essay, “Fragment the media! Embrace the shards!”)
* What Deepti has learned about what makes for pro-social conversations that build community on messaging boards and private groups.
* Why it’s an oxymoron to call Twitter a ‘global town square,’ and the relationship between scale and trustworthy information ecosystems.
* The importance of ‘digital stewards’ in facilitating online community.
* How the social capital people build online is translating into IRL actions and civic engagement.
* What a future might look like if New_Public realizes the vision of Local Lab.

That’s it for now,
Charley
Dec 8, 2024 • 32min

Block, build, be

This week, I’m sharing my conversation with Anya Kamenetz, the creator of The Golden Hour, a newsletter about “thriving and caring for others on a rapidly changing planet.” Anya and I announced a new partnership recently — now, when you sign up for an annual paid subscription to Untangled, you’ll get free access to the paid version of The Golden Hour — and we wanted to talk about it, and the work ahead.

Along the way, we also discuss:

* How we’re adapting our newsletters in response to the election.
* Why mitigating harms isn’t sufficient, and a framework that can help us all orient to the present moment: block, build, be.
* How we consume information — our mindsets, habits, and practices — and also, why ‘consume’ isn’t the right frame.
* The difference between social media connections and email-based relationships.
* How to talk to your kids about the election.
* The fragmentation of the news media environment and why it’s a good thing.

I couldn’t be more excited to partner with Anya and introduce you to her work. Enjoy!

More soon,
Charley
Nov 24, 2024 • 42min

Is AI snake oil?

Hi, I’m Charley, and this is Untangled, a newsletter about our sociotechnical world, and how to change it.

* Come work with me! The initiative I lead at Data & Society is hiring for a Community Manager. Learn more here.
* Check out my new course, Sociotechnical Systems Change in Practice. The first cohort will take place on January 11 and 12, and you can sign up here.
* Last week I interviewed Mozilla’s Jasmine Sun and Nik Marda on the potential of public AI, and the week prior I shared my conversation with AI reporter Karen Hao on OpenAI’s mythology, Meta’s secret, and Microsoft’s hypocrisy.

🚨 This is your last chance to get Untangled 40 percent off. Even better, I partnered with Anya Kamenetz to offer you her great newsletter The Golden Hour for free! Signing up for Untangled right now means you’ll get $140 in value for $54.

On to the show!

This week I spoke with Arvind Narayanan, professor of computer science at Princeton University and director of its Center for Information Technology Policy, about his great new book with Sayash Kapoor, AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference. We discuss:

* The difference between generative AI and predictive AI, and why we’re both more concerned by the latter.
* Whether generative AI systems can ‘understand’ and ‘reason.’
* The difference between intelligence and power, and why Arvind isn’t so concerned by the supposed existential threats of AI.
* Why artificial intelligence appeals to broken institutions.
* How Arvind would change AI discourse.
* How technical and social experts misunderstand one another.
* What a Trump second term means for AI regulation.
* What excites Arvind about how his children will experience new technologies, and what makes him nervous.

More soon,
Charley
Nov 17, 2024 • 40min

The potential of public AI

Hi, I’m Charley, and this is Untangled, a newsletter about our sociotechnical world, and how to change it.

* Untangled crossed the 8,000 subscriber mark this week. Woot!
* Come work with me! The initiative I lead at Data & Society is hiring for a Community Manager. Learn more here.
* Last week, I shared my conversation with award-winning AI reporter Karen Hao on OpenAI’s mythology, Meta’s secret, and Microsoft’s hypocrisy.
* I launched my new course, Sociotechnical Systems Change in Practice. The first cohort will take place on January 11 and 12, and you can sign up here. (As you’ll see, I’ve decided to offer a free 1:1 coaching session to all participants following the course.)

🚨 Untangled is 40 percent off (this is the largest discount I’ve offered, and it will end in two weeks), and I partnered with Anya Kamenetz to offer you her great newsletter The Golden Hour for free! Signing up for Untangled right now means you’ll get $140 in value for $54.

On to the show!

This week I’m sharing a conversation with Jasmine Sun and Nik Marda on the potential of public AI. We recorded the conversation before the election, so it might seem like an odd conversation to pipe into your earbuds now. Yes, the world looks different than it did then. But AI should still serve our collective goals, it should be shaped by our participation, and it should be accountable to us. And the ‘public’ doesn’t just mean the government — it means us! As civil rights groups and policy advocates prepare to play defense over the next four years, we must also articulate an affirmative vision of the future, and work to ensure our technologies serve it, and us.

Nik and Jasmine’s paper — and this discussion — offer a helpful guide to building that future. We discuss:

* What ‘public AI’ is, and the importance of articulating an affirmative vision of the future we want to create.
* The three core attributes that animate public AI — public goods, public orientation, and public use — and what would need to change to realize its potential.
* Shifting how we collectively understand AI — what it is, what it’s not, what it can do, what it can’t.
* How our public imagination tends to conjure AI extremes — utopias where no one has to work and dystopias where AI somehow, someway, tripwires an existential event — and what a ‘public AI’ future might look like.

Okay, that’s it for now,
Charley
Nov 10, 2024 • 36min

OpenAI’s mythology, Meta’s secret, and Microsoft’s hypocrisy

Hi, I’m Charley, and this is Untangled, a newsletter about our sociotechnical world, and how to change it.

* Last week, I argued that the shared reality that the U.S. has long glorified was predominantly white and male, and historically, fragmentation has proven to be a good thing.
* I launched my new course, Sociotechnical Systems Change in Practice. The first cohort will take place on January 11 and 12, and you can sign up here. (As you’ll see, I’ve decided to offer a free 1:1 coaching session to all participants following the course.)
* Untangled is 40 percent off at the moment, and I partnered with Anya Kamenetz to offer you her great newsletter The Golden Hour for free! Check out her latest on how to talk to your kids about the election. Signing up for Untangled right now means you’ll get $140 in value for $54.

This week, I’m sharing my conversation with Karen Hao, an award-winning writer covering artificial intelligence for The Atlantic. We discuss:

* Karen’s investigation into Microsoft’s hypocrisy on AI and climate change.
* How OpenAI’s mythology reminds Karen of Dune. (I can’t stop thinking about the connection after Karen made it.)
* How Meta uses shell companies to hide from community scrutiny when building new data centers.
* How AI discourse should change and what Karen is doing to train journalists on how to report on AI.
* How to shift power within tech companies. Employee organizing? Community advocacy? Reporting that rejects narratives premised on future promises and innovation for its own sake? Yes.

Reflections on the last week

I interviewed Karen on the morning of the election. I hesitated to share the episode this Sunday but ultimately decided to release it because it’s a conversation about big, structural problems, and what we can do about them. The election results affirm for me the pivot I announced a few weeks ago. Namely, we can’t solve existing problems or fix broken institutions such that they return us to the status quo. We’re (still!) not going back. We have to transform existing sociotechnical systems as we address the rot that lies beneath. We must imagine alternative futures and align our individual and collective actions to them. We have to live these futures today, and then tomorrow.

One day at a time,
Charley
Sep 8, 2024 • 32min

A world overrun by AI agents w/Evan Ratliff

Hi, it’s Charley, and this is Untangled, a newsletter about technology, people, and power. Today, I’m sharing my conversation with Evan Ratliff, journalist and host of Shell Game, a funny and provocative new podcast about “things that are not what they seem.” Evan cloned his voice, hitched it to an AI agent, and then put it in conversation with scammers and spammers, a therapist, work colleagues, and even his friends and family. Shell Game helps listeners see a li’l farther into a future overrun with AI agents, and I wanted to speak with Evan about his experience of this future.

In our conversation, we discuss:

* The hilarity that ensues when Evan’s AI agent engages with scammers and spammers, and the quirks and limitations of these tools.
* The harrowing experience of listening to your AI agent make stuff up about you in therapy.
* How those building these tools view the problem(s) they’re solving.
* What it’s like to send your AI agent to work meetings in your place.
* The work required to maintain these tools and make their outputs useful — do they actually help you save time and be more productive?
* The lingering uncertainty these tools cultivated through their interactions with Evan’s family and friends.

If you find the conversation interesting, share it with a friend.

Okay, that’s it for now,
Charley
Aug 18, 2024 • 47min

I turned 40 this week.

Hi, it’s Charley, and this is Untangled, a newsletter about technology, people, and power.

Can’t afford a subscription but value Untangled’s offerings? Let me know! You only need to reply to this email, and I’ll add you to the paid subscription list, no questions asked.

I turned 40 this week, and I spent the weekend in nature, surrounded by my favorite people. While my cup is running over with friendship, love, and support, I’ll always take more 🤣. You can celebrate me and my next trip around the sun by becoming a paid subscriber and buying my first book, AI Untangled.

This month:

* I published an essay about the power of utopian thinking — how one version got us into this AI mess, and why getting out will require a very different approach. (Remember, you have until August 31st to submit a vignette of your sociotechnical utopia.)
* I shared my conversation with Shannon Vallor, the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI) at the University of Edinburgh. Vallor and I talk about her great new book, The AI Mirror: Reclaiming Our Humanity in an Age of Machine Thinking, and how to chart a new path from the one we’re on.

This week, I’m resharing my October 2022 conversation with Brandon Silverman, co-founder and CEO of CrowdTangle, the data analytics tool once at the center of a controversy inside Meta over just how transparent the company should be. Meta shut down the tool this week, and we’re all worse for it.

In the episode, we get into Brandon’s time at Meta and the fights over CrowdTangle, but we spend most of our time exploring his views on transparency — its utility and limitations, its relationship to accountability, power, and trust — and how they have evolved.

Along the way, we discuss:

* How Brandon initially got “red-pilled” on transparency.
* How CrowdTangle challenged the stories Facebook leadership told themselves about the platform’s impact on the world.
* How the scale of these platforms means that when it comes to solutions, “it’s tradeoffs all the way down.”

This episode pairs nicely with the second-ever essay I wrote for Untangled, “Some Unsatisfying Solutions for Facebook,” which delves into the conceptual limitations of transparency. Just as we should never stop pushing for transparency, we can’t mistake it for accountability.

🙏 Thank You

When I turned 39 last year, I wrote this:

“I turn 39 today, so perhaps it’s fitting that I’ve been thinking a lot about time. I want time to feel slow and expansive. I want each day to feel justified on its own terms. I want the value of each activity to lie in the doing, not in the end result. That’s what Untangled has been for me. Not always — sometimes writing is the absolute worst — but on a good day, when I sit down at the keyboard, I enjoy the process, and it feels like flow.”

I feel closer to this feeling as I turn 40. That’s partly because of you! The other part? Meditation! But the point is, your support allows me to show up to the keyboard every morning before the sun comes up, and write. It affords me glimpses of this feeling, of time slowing down, and joy in the moment. It turns out that enjoying the moment also produces results: last year, I wrote 51 issues and published a book. Thanks for being along for the ride.

That’s it for now.
Charley
Aug 14, 2024 • 47min

AI is a mirror. What can it show us?

Hi, it’s Charley, and this is Untangled, a newsletter about technology, people, and power.

This week I’m sharing my conversation with Shannon Vallor, the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI) at the University of Edinburgh. Vallor and I talk about her great new book, The AI Mirror: Reclaiming Our Humanity in an Age of Machine Thinking, and how to chart a new path from the one we’re on. We discuss:

* The metaphor of an ‘AI mirror’ — what it is, and how it helps us better understand what AI is (and isn’t!).
* What AI mirrors reveal about ourselves and our past.
* How AI mirrors distort what we see — whose voices and values they amplify, and who is left out of the picture altogether.
* How Vallor would change AI discourse.
* How we might chart a new path toward a fundamentally different future — as a sneak peek, it requires starting with outcomes and values and thinking backward.
* How we can become so much more than the limits subtly shaping our teenage selves (e.g., conceptions of what we’re good at, what we’re not) — and how that growth and evolution doesn’t have to stop as we age.

It’s not hyperbole when I say Vallor’s book is the best thing I’ve read this year. If you send me a picture holding it in one hand, and my new book in the other, I might just explode with joy.

More soon,
Charley
