Podcast – Cory Doctorow's craphound.com

Cory Doctorow
Dec 2, 2019

Party Discipline, a Walkaway story (Part 1)

In my latest podcast (MP3), I’ve started a serial reading of my novella Party Discipline, which I wrote while on a 35-city, 45-day tour for my novel Walkaway in 2017. Party Discipline is a story set in the world of Walkaway, about two high-school seniors who conspire to throw a “Communist Party” at a sheet metal factory whose owners are shutting down and stealing their workers’ final paychecks. These parties are both literally parties (music, dancing, intoxicants) and “Communist” in that the partygoers take over the means of production and start them up, giving away the products they create to the attendees. Walkaway opens with a Communist Party, and I wanted to dig into what might go into pulling one of those off.

I don’t remember how we decided exactly to throw a Communist party. It had been a running joke all through senior year, whenever the obvious divisions between the semi-zottas and the rest of us came too close to the surface at Burbank High: “Have fun at Stanford, come drink with us at the Communist parties when you’re back on break.” The semi-zottas were mostly white, with some Asians—not the brown kind—for spice. The non-zottas were brown and black, and we were on our way out. Out of Burbank High, out of Burbank, too. Our parents had lucked into lottery tickets, buying houses in Burbank back when they were only ridiculously expensive. Now they were crazy. We’d be the last generation of brown kids to go to Burbank High because the instant we graduated, our parents were going to sell and use the money to go somewhere cheaper, and the leftovers would let us all take a couple of mid-range MOOCs from a Big Ten university to round out our community college distance-ed degrees.
Dec 1, 2019

Talking with the Out of Left Field podcast about Sidewalk Labs’s plan to build a surveilling “smart city” in Toronto

We’ve been closely following the plan by Google sister company Sidewalk Labs to build a surveilling “smart city” in Toronto; last week, I sat down with the Out of Left Field podcast (MP3) to discuss what’s going on with Sidewalk Labs, how it fits into the story of Big Tech, and what the alternatives might be.
Nov 27, 2019

Talking Adversarial Interoperability with Y Combinator

Earlier this month while I was in San Francisco, I went over to the Y Combinator incubator to record a podcast (MP3); we talked for more than an hour about the history of Adversarial Interoperability, its role in creating Silicon Valley and the tech sector, how monopolization now threatens adversarial interop, and how that monopolization fuels the conspiratorial thinking that is so present in our modern politics. We also talk about how startup founders and other technologists can use science fiction for inspiration, and about the market opportunities presented by challenging Big Tech and its giant, massively profitable systems.
Nov 26, 2019

The Engagement-Maximization Presidency

In my latest podcast (MP3), I read my May 2018 Locus column, “The Engagement-Maximization Presidency,” where I propose a theory to explain the political phenomenon of Donald Trump: we live in a world in which communications platforms amplify anything that gets “engagement” and provide feedback on just how much your message has been amplified, so you can tune and re-tune for maximum amplification. Peter Watts’s 2002 novel Maelstrom illustrates a beautiful, terrifying example of this, in which a mindless, self-modifying computer virus turns itself into a chatbot that impersonates patient zero in a world-destroying pandemic; even though the virus doesn’t understand what it’s doing or how it’s doing it, it’s able to use feedback to refine its strategies, gaining control over more resources with which to try more strategies. It’s a powerful metaphor for the kind of cold reading we see Trump engaging in at his rallies, and for the presidency itself. I think it also explains why getting Trump off Twitter is impossible: it’s his primary feedback tool, and without it, he wouldn’t know what kinds of rhetoric to double down on and what to quietly sideline.

Maelstrom is concerned with a pandemic that is started by its protagonist, Lenie Clark, who returns from a deep ocean rift bearing an ancient, devastating pathogen that burns its way through the human race, felling people by the millions. As Clark walks across the world on a mission of her own, her presence in a message or news story becomes a signal of the utmost urgency. The filters – firewalls that give priority to some packets and suppress others as potentially malicious – are programmed to give highest priority to any news that might pertain to Lenie Clark, as the authorities try to stop her from bringing death wherever she goes.
Here’s where Watts’s evolutionary biology shines: he posits a piece of self-modifying malicious software – something that really exists in the world today – that automatically generates variations on its tactics to find computers to run on and reproduce itself. The more computers it colonizes, the more strategies it can try and the more computational power it can devote to analyzing these experiments and directing its random walk through the space of all possible messages to find the strategies that penetrate more firewalls and give it more computational power to devote to its task. Through the kind of blind evolution that produces predator-fooling false eyes on the tails of tropical fish, the virus begins to pretend that it is Lenie Clark, sending messages of increasing convincingness as it learns to impersonate patient zero. The better it gets at this, the more welcoming it finds the firewalls and the more computers it infects. At the same time, the actual pathogen that Lenie Clark brought up from the deeps is finding more and more hospitable hosts to reproduce in, thanks to the computer virus, which is directing public health authorities to take countermeasures in all the wrong places. The more effective the computer virus is at neutralizing public health authorities, the more the biological virus spreads. The more the biological virus spreads, the more anxious the public health authorities become for news of its progress, and the more computers there are trying to suck in any intelligence that seems to emanate from Lenie Clark, supercharging the computer virus. Together, this computer virus and biological virus co-evolve, symbiotes who cooperate without ever intending to, like the predator that kills the prey that feeds the scavenging pathogen that weakens other prey to make it easier for predators to catch them.
Nov 25, 2019

Talking about Disney’s 1964 Carousel of Progress with Bleeding Cool: our lost animatronic future

Back in 2007, I wrote a science fiction novella called “The Great Big Beautiful Tomorrow,” about an immortal, transhuman survivor of an apocalypse whose father is obsessed with preserving artifacts from the fallen civilization, especially the Carousel of Progress, an exhibition that GE commissioned from Disney for the 1964 World’s Fair in New York, and which is still operating in Walt Disney World. The novella was collected into my 2011 Outspoken Authors book from PM Press. Bleeding Cool News’s Jason Henderson is a fellow Carousel of Progress obsessive, and he asked me if I’d come on his Castle of Horror podcast to discuss the story, the Carousel, Margaret Thatcher, and optimistic corporate futurism’s increasing absurdity in 2019’s world. The conversation went great (MP3) and ranges over nuclear armageddon, environmental collapse, Epcot Center, cults of personality, Walt Disney’s work-avoidance schemes, Alvin Toffler, and the Singularity.

The Carousel of Progress is a strange success story. It began its life as an exhibit developed by Walt Disney for General Electric at the 1964 World’s Fair. There the basic structure fell into place: a revolving theater moves around a stationary stage showing four scenes as the audience comes to rest in front of each one. Act One is the turn of the 20th century, Act Two is the 1920s, Act Three is the 1940s, and Act Four is roughly the present. In each scene, the father of a small family reflects on American life and culture and mentions the latest technological advancements—airplanes, electric fans, cars, “the rat race.” The attraction moved from the World’s Fair to Disneyland and then settled in 1975 at Tomorrowland in Walt Disney World, where it has remained ever since. It has not been updated since 1993. And yet the ride remains strangely compelling and even comforting, a weird mix of futurism and nostalgia.
The theme song of the ride is pure optimistic futurism: “It’s a Great Big Beautiful Tomorrow” by reliable Disney songwriters Richard M. and Robert B. Sherman. Doctorow points out, though, that for years this song was traded out for a different song, “The Best Time of Your Life.” As in “this is the best time of your life,” which would have to mean that the future will not be as good. Now the original song is back, but we are listening to an echo—optimism frozen in the past. Cory and Jason talk about futurism, the optimism (and pessimism) of the mid-century, and whether forgotten mid-century fears are a match for modern fears of climate collapse. Along the way, we touch on the strangeness of Tomorrowland and Epcot.

Castle Talk: Cory Doctorow on Disney’s Carousel of Progress and Lost Optimism [Jason Henderson/Bleeding Cool]
Nov 18, 2019

Jeannette Ng Was Right: John W. Campbell Was a Fascist

In my latest podcast (MP3), I read my new Locus column, “Jeannette Ng Was Right: John W. Campbell Was a Fascist,” which revisits Jeannette Ng’s Campbell Awards speech from this summer’s World Science Fiction convention. As far as I know, I’m the only person to have won both awards named for Campbell, which, I think, gives me license to speak on the subject. I think that Ng was absolutely right about Campbell and his legacy, and I think that understanding that the good that people do doesn’t erase the harms they cause (and vice-versa) is critical to navigating a world of flawed people.

Here’s the thing: neither one of those facets of Campbell cancels the other one out. Just as it’s not true that any amount of good deeds done for some people can repair the harms he visited on others, it’s also true that none of those harms cancel out the kindnesses he did for the people he was kind to. Life is not a ledger. Your sins can’t be paid off through good deeds. Your good deeds are not cancelled by your sins. Your sins and your good deeds live alongside one another. They coexist in superposition. You (and I) can (and should) atone for our misdeeds. We can (and should) apologize for them to the people we’ve wronged. We should do those things, not because they will erase our misdeeds, but because the only thing worse than being really wrong is not learning to be better. People are flawed vessels. The circumstances around us – our social norms and institutions – can be structured to bring out our worst natures or our best. We can invite Isaac Asimov to our cons to deliver a lecture on “The Power of Posterior Pinching” in which he would literally advise men on how to grope the women in attendance, or we can create and enforce a Code of Conduct that would bounce anyone, up to and including the con chair and the guest of honor, who tried a stunt like that.
Nov 3, 2019

Talking with The Storyteller’s Thread about YA literature, activism, and technological rebellion

Séan Connors is a young adult literature researcher at the University of Arkansas, whose podcast, The Storyteller’s Thread, features long-form interviews with young adult writers “on their writing process; on social and political topics that influence their work; on their motivation for writing for young readers; and on other writers and artists whose work challenges and inspires them.” I had the pleasure of recording with Connors on his latest episode (MP3), where we talked about youth activism and YA literature, how I became a writer and then a YA writer, and how schools could do a better job of teaching technical and privacy literacy.
Oct 29, 2019

Affordances: a new science fiction story that climbs the terrible technology adoption curve

In my latest podcast (MP3), I read my short story “Affordances,” which was commissioned for Slate/ASU’s Future Tense Fiction. It’s a tale exploring my theory of “the shitty technology adoption curve,” in which terrible technological ideas are first imposed on poor and powerless people, and then refined and normalized until they are spread over all the rest of us. The story makes the point by exploring all the people in a facial recognition ecosystem, from low-waged climate refugees who are paid to monitor facial recognition errors in an overseas boiler room, to cops whose facial recognition systems and risk-assessment scoring institutionalize algorithmic racism, to activists whose videos of human rights abuses on the US border are disappeared by copyright enforcement bots deployed by shadowy astroturf organizations, to the executives at the companies who make the facial recognition tools, whose decisions are constrained by automated high-speed trading bots. It also explores methods of technological resistance, solidarity, and activism, and how the flip-side of automated systems’ inaccuracy is their fragility. The story is accompanied by a response essay by Nettrice Gaskins (previously), “an artist-educator who collaborates with AI,” who discusses it in the context of the “afrocentric counter-surveillance aesthetic,” which is my new all-time favorite phrase.

There were different kinds of anxiety: the anxiety she’d felt when she was recording the people massing for their rush, clammy under the thermal blanket with its layer of retroreflective paint that would confound drones and cameras; she walked among the people, their faces shining, their few things on their backs, their children in their arms, the smell of too many bodies and too much fear.
Then there was the anxiety she felt as she retreated to her perch, where she had her long lenses, each attached to its own phone, all recording as the rush formed up, the blankets rustling and rippling and then ripping as bodies burst forth, right into the gas, into the rubber bullets, into the armored bodies that raised their truncheons and swung and swung and swung, while the klaxons blared and the drones took to the sky. There was the anxiety she felt when the skirmish ended and she trained her lenses on the bodies sprawled on the concrete, the toys and bags that had been dropped, the child holding her mother’s limp hand and wailing. But now came a different kind of anxiety as she edited her footage down, mixing it and captioning it, being careful to blur the faces, but being even more careful to avoid any of the anti-extremism trigger-words: migration, violence, race, racism—words that white nationalists used, but also words that were impossible to avoid when discussing the victims of white nationalism. Advertisers hated them, and algorithms couldn’t tell the difference between people being racist and people complaining about racism. There were new euphemisms every week, and new blacklists, too. In theory, she could just hit publish and when the filter blocked her, she could get in line behind tens of millions of other people whose videos had been misclassified by the bots. But she didn’t want to wait 10 months for her video to be sprung from content jail; she wanted it to go viral now.
Oct 26, 2019

Can we change our politics with science fiction? A conversation with the How Do You Like It So Far podcast

Henry Jenkins (previously) is the preeminent scholar of fandom and culture; Colin Maclay is a communications researcher with a background in tech policy. On the latest episode of their “How Do You Like It So Far” podcast (MP3), we had a long discussion about a theory of change based on political work and science fictional storytelling, in which helping people imagine a better world (or warning them about a worse one) is a springboard to mobilizing political action.
Oct 22, 2019

Talking science fiction, technological self-determination, inequality and competition with physicist Sean Carroll

Sean Carroll is a physicist at JPL and the author of many popular, smart books about physics for a lay audience; his weekly Mindscape podcast is a treasure-trove of incredibly smart, fascinating discussions with people from a wide variety of backgrounds. The latest episode (MP3) is a 1h+ interview with me, on wide-ranging subjects: adversarial interoperability, inequality and market concentration; science fiction and its role in political discourse; and the power and peril of technological self-determination. For those of you who prefer to read, Carroll is kind enough to provide a full transcript.

0:02:52 SC: So here’s an ambitious question to start us off then. We’re clearly not in equilibrium; the internet and the way that we use it is changing rapidly. Do you see us approaching a future internet equilibrium? Even if you can’t say exactly what it is, can you imagine various forms of steady states that we will eventually reach in terms of how we use the internet and how it affects our lives, stuff like that?

0:03:16 CD: I think there’s actually a risk of that. I would not call that a good outcome. As other people have observed, the web has become five websites filled with screenshots from the other four, and that domination of the web by a small number of firms that continues to shrink, and who clearly carve out competitive niches for one another, and occasionally compete with each other, but mostly are content to just sit pat, that has been, I think, a net negative for the internet, and for human thriving, and for things like human rights.
And I fear that the path to that becoming permanent is that regulators will observe the dysfunction of a highly concentrated internet, for example, a single social platform with 2.3 billion people on it, whose choices about algorithmic filtering and recommendation drive all kinds of negative outcomes, including people who understand how to game the system to livestream mass shootings in Christchurch.

0:04:16 CD: And that they’ll say to these firms, “Since we can’t imagine any way to make you smaller, and therefore to make your bad decisions less consequential, we will instead insist that you take measures that would traditionally be in the domain of the state, like policing bad speech and bad actions.” And those measures will be so expensive that they will preclude any new entrants to the market. So whatever anticompetitive environment we have now will become permanent. And I call it the constitutional monarchy. It’s where, instead of hoping that we could have a technological democracy, where you have small holders who individually pitch their little corner of the web, and maybe federate with one another to build bigger systems, but that are ultimately powers devolved to the periphery, instead what we say is that the current winners of the technological lottery actually rule with the divine right of kings, and they will be our rulers forever. But in exchange for that, they will suffer themselves to be draped in golden chains by an aristocracy of regulators who are ultimately gonna be drawn from their upper echelons, because when you only have five companies in an industry, the only people who understand them well enough to regulate them are their executives. And so you end up with just a revolving door.
0:05:28 CD: And so the aristocracy will call upon the tech giants to exercise a noblesse oblige, where they will suffer themselves to make certain concessions to the public interest at the expense of their shareholders, but in exchange they will be guaranteed a regulatory environment that precludes anyone ever challenging them. And I think that will be studied, but not for long, because I also think that if we think that Google and Facebook are intransigent today, if we give them a decade without even having to buy potential competitors to prevent them from growing to challenge them, imagine how bullyish and terrible they’ll be in 10 years.

69 | Cory Doctorow on Technology, Monopoly, and the Future of the Internet [Sean Carroll’s Mindscape]
