69 | Cory Doctorow on Technology, Monopoly, and the Future of the Internet

Like so many technological innovations, the internet is something that burst on the scene and pervaded human life well before we had time to sit down and think through how something like that should work and how it should be organized. In multiple ways -- as a blogger, activist, fiction writer, and more -- Cory Doctorow has been thinking about how the internet is affecting our lives since the very beginning. He has been especially interested in legal issues surrounding copyright, publishing, and free speech, and recently his attention has turned to broader economic concerns. We talk about how the internet has become largely organized through just a small number of quasi-monopolistic portals, how this affects the ways in which we gather information and decide whether to trust outside sources, and where things might go from here.

Support Mindscape on Patreon.

Cory Doctorow is a science fiction writer, activist, journalist, and blogger. He is a co-editor of the website Boing Boing, and works as a special consultant for the Electronic Frontier Foundation. He is the author of the nonfiction book Information Doesn't Want to Be Free as well as science-fiction works such as Walkaway and Radicalized. He has been awarded an honorary doctorate from the Open University, where he is also a Visiting Professor, as well as being an MIT Media Lab Research Affiliate and a Visiting Professor of Practice at the University of South Carolina’s School of Library and Information Science.

0:00:00 Sean Carroll: Hello everyone, and welcome to the Mindscape Podcast. I'm your host, Sean Carroll. So the internet, you may have heard of it, kind of a big deal. Good chance that you're either listening to this podcast over the internet or somehow used the internet to find out about this podcast or to download it or whatever. Things are still in a state of flux when it comes to how society deals with not just the internet, but technology more generally: computers, smartphones, things like that. And one of our sharpest thinkers about this relationship between humanity and the changing world of technology is today's guest, Cory Doctorow.

0:00:37 SC: Cory is best known perhaps as the founder and co-editor of the blog, Boing Boing, but he's also a science fiction writer. His most recent book is called Radicalized, a collection of four novellas. But he's also a very prolific nonfiction writer, and he thinks deeply, not just about technology, but about the law, philosophy, morality, the economics of it all. So we really get into the relationship that people have, not only with technology, but with the corporations, the powers that bring this technology to us, and the ways that they're leveraging our interest to make money for themselves, and the good parts and bad parts of that, and how we can fight against it. Cory talks about these things, both in his essays and books, but also in his stories, which gives him a number of different angles on these important problems. So it's a very fast, multi-faceted, idea-rich conversation, I think you're going to like it.

0:01:33 SC: Remember that you can visit the podcast homepage at preposterousuniverse.com. And if you're interested in listening to Mindscape without ads, you can become a patron on Patreon, which is linked from the podcast page. Patrons also get monthly Ask Me Anything episodes; you can ask me a question and I'll do my best to answer it. Of course, regular old episodes always available here for free, whenever you want them. So let's go.

[music]

0:02:15 SC: Cory Doctorow, thanks so much for being on the podcast.

0:02:17 Cory Doctorow: It's my pleasure. Thank you for having me.

0:02:19 SC: You've been around the internet a long time, I don't know, since the very earliest days, or at least since before we knew it as the internet.

0:02:26 CD: Sure, I think my first point of contact was with early networks across universities in the '70s, when my dad started bringing home teletype terminals from the University of Toronto, and then on BBSs that were connected to FidoNet, and through that to Usenet. So not as early as people who helped with the TCP transition for ARPANET, but pretty early.

0:02:52 SC: So here's an ambitious question to start us off then. We're clearly not in equilibrium; the internet and the way that we use it is changing rapidly. Do you see us approaching a future internet equilibrium? Even if you can't say exactly what it is, can you imagine various forms of steady states that we will eventually reach in terms of how we use the internet and how it affects our lives, stuff like that?

0:03:16 CD: I think there's actually a risk of that. I would not call that a good outcome. As other people have observed, the web has become five websites filled with screenshots from the other four, and that domination of the web by a small number of firms that continues to shrink, and who clearly carve out competitive niches for one another, and occasionally compete with each other, but mostly are content to just stand pat, that has been, I think, a net negative for the internet, and for human thriving, and for things like human rights. And I fear that the path to that becoming permanent is that regulators will observe the dysfunction of a highly concentrated internet, for example, a single social platform with 2.3 billion people on it, whose choices about algorithmic filtering and recommendation drive all kinds of negative outcomes, including people who understand how to game the system to livestream a mass shooting in Christchurch.

0:04:16 CD: And that they'll say to these firms, "Since we can't imagine any way to make you smaller, and therefore to make your bad decisions less consequential, we will instead insist that you take measures that would traditionally be in the domain of the state, like policing bad speech and bad actions." And those measures will be so expensive that they will preclude any new entrants to the market. So whatever anticompetitive environment we have now will become permanent. And I call it the constitutional monarchy. It's where, instead of hoping that we could have a technological democracy, where you have small holders who individually pitch their little corner of the web, and maybe federate with one another to build bigger systems, but that are ultimately powers devolved to the periphery, instead what we say is that the current winners of the technological lottery actually rule with the divine right of kings, and they will be our rulers forever. But in exchange for that, they will suffer themselves to be draped in golden chains by an aristocracy of regulators who are ultimately gonna be drawn from their upper echelons, because when you only have five companies in an industry, the only people who understand them well enough to regulate them are their executives. And so you end up with just a revolving door.

0:05:28 CD: And so the aristocracy will call upon the tech giants to exercise a noblesse oblige, where they will suffer themselves to make certain concessions to the public interest at the expense of their shareholders, but in exchange they will be guaranteed a regulatory environment that precludes anyone ever challenging them. And I think that will be stable, but not for long, because I also think that if we think that Google and Facebook are intransigent today, if we give them a decade without even having to buy potential competitors to prevent them from growing to challenge them, imagine how bullyish and terrible they'll be in 10 years.

0:06:03 SC: So there is a little bit of a leap there that I'd like to dig into more. Certainly there's this move for the government or the people asking places like Google and Facebook to police themselves a little bit better, to police speech on their platforms, etcetera, Twitter. But then so if we're making that ask, you seem to be implying that that's actually secretly giving some power to them that we should...

0:06:26 CD: Yeah, sure.

0:06:27 SC: Maintain for ourselves.

0:06:28 CD: I mean, Mark Zuckerberg said it himself, he went up and he said, "Please regulate Facebook. Please throw me in that briar patch." Because Zuckerberg knows that any regulation that is made to curb the worst parts of Facebook will not include as a potential remedy getting rid of Facebook. What Facebook and Google, and the other big tech companies including Apple, perceive is an opportunity to become de facto state monopolies, comparable to, say, AT&T during the heyday of the Bell System. So there was a long time when the Bell System's dysfunctions were very obvious. The company had engaged in so much anticompetitive behavior that was ultimately bad for innovation, and bad for individual liberty, and the liberty of groups, and the ability of people to coordinate among themselves, and access to telecommunications infrastructure in rural places, and so on.

0:07:25 CD: And yet no one had the appetite to break up AT&T because whenever it was proposed, AT&T would say, "Well, we have been mandated to become a part of the nation's public safety and security. Come disasters, come crises, we are asked to be a kind of quasi-governmental entity. We need the monopoly profits that we get from our abusive practices, from expensive long distance, from insisting that everyone rent their phones month-on-month until they pay for them 100 times over, from being able to decide who can plug what into the system so that we can keep people from having answering machines unless they come from us, and so on. We need those windfall profits to pay for those state-like duties." And it took arguably 30 years longer than it should have to break up AT&T, and the breakup unfortunately arrived at the tail end of the antitrust movement: in 1982 we finally break them up just as Reagan is dismantling antitrust. So immediately what they do is reconverge so that we end up with not one telco but three, and they're all terrible. [chuckle]

0:08:34 SC: Anyone who has a cell phone knows that they're pretty terrible.

0:08:36 CD: Yeah. And all run by former AT&T executives. They're all spin-offs of AT&T run by former AT&T executives. It's like the pope dividing up the New World. It's nice that the Portuguese and the Spaniards didn't have to fight over it, but I'm sure the people they subjugated weren't happy.

0:08:52 SC: And you think that the present situation with Facebook and Google and Amazon having so much of their respective market shares wouldn't have been possible under the previous antitrust regime?

0:09:04 CD: Sure. You look at Google, Google's a company that only ever made one and a half products; they made a really good search engine and a pretty good Hotmail clone. And then everything else they do is a company they bought, and it's a company that they wouldn't have been permitted to buy pre-Reagan. And it wasn't just Reagan, Reagan kicked it off, but every successive administration has reified and expanded Reagan's antitrust malpractice, up to and including Obama, and certainly Trump has supercharged it, greenlighting absurd mergers like T-Mobile/Sprint. But the pre-Reagan era had a pretty widespread prohibition on merging with major competitors, acquiring potential future competitors as they were getting started, or cornering vertical markets. And if you look at Apple, Google, Facebook, Microsoft, it's all they've ever done, it's all... [chuckle] Yahoo's the poster child for this.

0:09:57 CD: And Yahoo's the poster child for what the terminal condition of it is, or one of the terminal conditions, which is that you can raise a ton of capital in the markets, then you can buy up and destroy every promising tech startup for 25 years, cash your investors out several times over, and still end up with nothing to show for it. That's actually the best-case scenario because then the company implodes...

0:10:19 SC: 'Cause they were so bad at it.

0:10:20 CD: Right. The worst-case scenario, back to your question about steady states, is that they do all of that but continue in a steady state: you end up with a permanent Facebook that just goes out and acqui-hires every promising technology company, destroys anything good about what they were making, takes the parts that can be used to enhance its monopoly and rent-seeking activity, and integrates them into its product lines. And we end up with a series of nested walled gardens, which is what we have now. Like last year Facebook lost the largest number of American users in its history, it lost 13 million 13-to-30... Oh no, 15 million 13-to-34-year-olds left Facebook last year, but the majority of them ended up on Instagram, [chuckle] which is a Facebook subsidiary.

0:11:02 SC: Which is owned by Facebook, yes, that's right.

0:11:07 CD: You can imagine this just going on forever, walled gardens within walled gardens, where you escape one and end... Only to end up in the next one over.

0:11:16 SC: So what would you like an equilibrium to be, if you had to put up with an equilibrium?

0:11:21 CD: So I would like for there to be a stable set of what for want of better terms we might call constitutional principles for federation. So if you think about US federalism as maybe an example... And it's obviously not without its dysfunctions, see the electoral college and so on. That what we have are individual, autonomous, small regulatory units in the form of states or mid-sized regulatory units in the form of states; sometimes they get very big, we're sitting here in California, but smaller than the nation. And then we have a set of governing principles that dictate with a minimum set of personal freedoms those regulators have to give to the people who are under their... Within their remit. And first among them is the freedom to go somewhere else. And if you could imagine that we would have a set of rules about distributing malicious software, denial of service attacks, certain kinds of incitements to violence, and discriminatory conduct related to protected categories of identity, including race and gender and so on, and that people who agree to adhere to those federate with other people who agree to adhere to those. Well, what they do within that federation is incredibly variable.

0:12:49 CD: So I know a woman, a friend of mine, who writes comic books, and her comic books are really cool, really smart on gender, kind of feminist-inflected superhero comics for mainstream comics publishers. And she is the target of harassment by a small group of really terrible men on Twitter who have a method for gaming Twitter's anti-harassment policies. And what they do is they send you just revolting, threatening direct messages, which they delete as soon as you read them, because Twitter won't accept screenshots 'cause they're so easy to doctor. And Twitter, to its credit, when you actually delete something, Twitter can't readily access it; it's pretty much deleted. Maybe there's a backup somewhere, but you can't say to Twitter, "Go find that DM that was in my messages two hours ago," 'cause it's gone, they can't see it. Which is good, you want that. You want... If you and your group of Hong Kong protesters are planning a protest, you don't want the politburo to be able to order Twitter to turn 'em over.

0:13:48 SC: You want to be able to delete things, that should be a right, yeah.

0:13:50 CD: Yeah. Deletion's good.

0:13:52 SC: Although if I receive a DM, maybe I should be able to keep a copy of it. [chuckle]

0:14:00 CD: What if you've been arrested by the Chinese state and you want... And the person who sent you the message wants to delete the incriminating side of it?

0:14:09 SC: Yeah, ideally I should be able to delete it, but maybe the Chinese police wouldn't let me do that. Yeah, that's fair.

0:14:13 CD: Yeah. So you get arrested, your phone is now in the hands of the Hong Kong Security Services. I have sent... You are privy to a thread of messages that exposes 50 organizers that you work with. They delete the messages. You want it to disappear from your phone. It's different threat models and different use cases, and actually we're gonna come to that in terms of what could be done. So then these men, what they do is they send messages in your public Twitter stream that reference this threatening message, but without the threatening message as a key they do not seem threatening in and of themselves. And so what they can do is continually harass you but without ever giving rise to an offense that would have them kicked off Twitter.

0:14:54 CD: Now imagine that Twitter could not avail itself of any legal tools to prevent a third party from making an interoperable Twitter service, a rival to Twitter, like a Mastodon instance that you could use to both read and write Twitter. And my friend and her dozen friends who are targeted by these 100 men could make a Twitter-alike that they could use to be part of the public discourse that takes place on Twitter but that would have a rule that all of these shitty men were blocked, and that would allow you to recover DMs for the purposes of dealing with this kind of harassment, and so on. You could now have a broad set of... Broad latitude within a decentralized, local means of communicating with everybody else that would allow people who were targeted for harassment to deal with this one-in-a-million use case, 'cause remember Facebook's got 2.3 billion users. That means they've got 2300 one-in-a-million use cases every day.

0:15:49 CD: So it would allow you to deal with this one-in-a-million use case that it is inconceivable big firms would be able to deal with, even if they had the political will, which they don't. And you could still, on that periphery, be part of this larger conversation. You wouldn't have to opt out of being in the world just to be protected from grotesque harassment. That would allow you to have a use case in which you say, "I want the evidentiary Twitter, where DMs are not deleted. I want to be able to retain them even if my counterparty doesn't." And you could opt into being in that Twitter, and Twitter wouldn't have... You wouldn't have to wait for Twitter to adjudicate your case to decide that that's what you want.

0:16:31 SC: So imagine there's 20 different clones of Twitter, each with different rules, and also interoperability as part of our future utopia equilibrium?

0:16:39 CD: Yeah, so that you would have not just interoperability through standardization, which you might call voluntary interoperability, and not just interoperability through indifference, what you might call indifferent interoperability; the people who make your car don't care what you plug into the cigarette lighter. Apple really wants everyone who makes an app to follow a set of interoperable terms, so they've got voluntary interoperability or enthusiastic interoperability. Your car manufacturer's got indifferent interoperability. Those are both good and really important. Standards fall into that category and so on. I'm talking about adversarial interoperability. That's when I plug something in that you really don't want plugged in because it helps me improve my life or the life of my customers or users. And so that would be like third-party printer ink.

0:17:20 CD: So imagine if Twitter could avail itself of technical countermeasures to try to block people who scrape their waiting Twitter messages and move them into a Twitter-alike service with their own rules, but they couldn't sue over patent infringement, they couldn't sue for violating terms of service, they couldn't sue under the Digital Millennium Copyright Act and its prohibition on circumvention, they couldn't bring a tortious interference claim. That you would have blanket immunity from all legal theories for any activity that creates interoperability that allows users to have more control over the technologies that they use.

0:17:53 SC: And I get the impression that this is not what these companies want. Facebook used to allow me to just forward my tweets and post them on Facebook, but they cut off that ability. I presume it's because they don't want too much interoperability.

0:18:07 CD: Facebook went further than that. Facebook, when it launched, had a tool that would log into Myspace, pretend to be you, use your login and password, get your waiting Myspace messages, put them in your Facebook inbox, and let you reply to them, because that's how they solved the collective action problem of, "Well, Facebook is better than Myspace, but my friends are on Myspace. I can't leave Myspace until my friends are on Facebook. They won't leave Myspace until I'm on Facebook. We're all stuck here within Myspace." They just let you have one foot on either side of the walled garden. Now, Facebook, having done that, sued a competitor...

0:18:37 SC: Of course.

0:18:37 CD: Called Power Ventures that made exactly the same tool that would allow you to read your Facebook messages and your LinkedIn messages and your Twitter messages all within one interface and get you away from Facebook being able to observe everything you do, spy on you while you're doing it, and then use that information to target ads and so on.

0:18:54 SC: What was the grounds for the lawsuit?

0:18:55 CD: It's the Computer Fraud and Abuse Act. So the Computer Fraud and Abuse Act was passed in 1986 after Ronald Reagan watched the movie WarGames and panicked.

[laughter]

0:19:03 CD: The federal prosecutors have been champing at the bit to have a more expansive definition of cyber crime and hacking. They arrived at a definition that was so expansive that it's effectively without limits. They said that any time you exceed your authorization on a system that does not belong to you that you potentially commit a felony, right? Now, that sounds like a reasonable thing on its face. If you work at the video store and your boss gives you the right to check in and out videos but doesn't give you the right to go back and get the home addresses of the cute customers and go stalk them at home, then you've exceeded your authorization. But today, authorization is terms of service. It's that sprawling novella of garbage legalese that no one's ever read and it boils down to like by being dumb enough to use the service, you agree that we're allowed to come over to your house and punch your grandmother and wear your underwear and make long distance calls.

0:19:54 SC: Right.

0:19:54 CD: Right? And so, Facebook argued that Power Ventures had violated its terms of service and in so doing had violated the Computer Fraud and Abuse Act. They built on a decision that Blizzard had gotten, the World of Warcraft people, over an interoperable game server called BnetD. Now, that law is being eroded or that precedent is being eroded. LinkedIn just lost a court challenge against a competitor called hiQ that scrapes LinkedIn, publicly-available LinkedIn data, to create analytics for employers. And that was fine with LinkedIn for a while and then LinkedIn launched its own analytics product and shut down hiQ and sued them, and the court not only found that hiQ was not violating the Computer Fraud and Abuse Act, they actually ordered LinkedIn to not take technical countermeasures to shut down hiQ, to allow hiQ to continue to scrape the service.

0:20:46 CD: So that legal theory, that exotic legal theory, which was very expensive and hard-won for Facebook, chilled a generation of technologists, right? This is why venture capitalists call Facebook's business, which grows year-on-year by double digits and is generating billions of dollars in profits, they call that the kill zone, right? Under normal market conditions, you would expect that if you had double digit year-on-year growth for a multi-billion dollar business that people would be very excited about figuring out how to take your 1,000% margin and offer the same service at a 500% margin.

0:21:20 CD: But no, right? Facebook stands alone. No one will fund you to compete with Facebook. The last company that really tried in earnest was Snap, and Facebook used an acquisition of theirs, an acquisition that would have been illegal prior to Reagan, a company called Onavo, that made a deceptive battery monitor that actually monitored everything you did on your phone and sent that telemetry to Facebook to discover that Facebook's users were downloading and installing Snap and to then gather in fine detail how Snap was being used. And Facebook used that to both buy and tweak Instagram to become a direct Snap competitor, right?

0:21:58 SC: I'm trying to think if I had this battery monitor. When was this happening?

0:22:01 CD: It was about seven years ago.

0:22:03 SC: Okay.

0:22:03 CD: Onavo then morphed into a surveilling VPN, which is like the most ironic piece of product, but it's a VPN that keeps what you do private from your ISP but spies on everything you do and sends it to Facebook.

0:22:14 SC: So there are examples of adversarial interoperability that happened just because the company really tried to do it. Like Apple made software where you could read Microsoft Word documents. But you seem to be suggesting that there are also legal issues here that... I mean is it true that if Facebook just tried hard enough on the tech side, they could prevent this kind of interoperability?

0:22:34 CD: Well, I mean that's, I think, a thing that Facebook would say. They would say, "You don't need to take away our legal defenses. We have a monopoly on the smartest technologists and so we will always win." But that's not what they did, right? They didn't just shut down Power Ventures, they sued Power Ventures. It's not like lawyers are cheaper than programmers. Maybe they just wanted to make an example, but I'm inclined to think that the legal issues are the real chilling effect. You know it's a truism in information security that defenders have a harder job than attackers. If you are building a wall around your castle, that wall needs to be perfect. If I wanna knock it down, I need to find one mistake you've made, right?

0:23:15 CD: And so, that asymmetry makes life very, very hard for defenders. And the fact that Facebook is supporting 2300 one-in-a-million use cases every day means that detecting bots and distinguishing them from users who are just doing things that are weird is very hard. The total scope of what passes for normal user activity among Facebook's 2.3 billion user pool is so broad that there's just plenty of latitude to make scrapers that look like a user somewhere.

0:23:49 SC: I wonder if a lot of people on the legal side have the example in mind of Internet Explorer where it was thought to be a monopoly and there were a lot of legal maneuvers to sort of break up that monopoly, but in retrospect, it kind of wasn't because it was very easy to replace it with something else. Do a lot of people think that Facebook will just be replaced organically by better competition or is this truly a different situation?

0:24:12 CD: So, it's an interesting question about what happened with Explorer. Explorer was at the heart of an antitrust lawsuit that ultimately did not succeed. But Microsoft insiders say that what happened through the antitrust enforcement action was that it changed the internal calculus inside the board rooms at Microsoft, that Microsoft, like every other big institution, has a spectrum of attitudes and ideas about products and the best way to launch them into the market. But historically, people who said, "We should do things that would be illegal under antitrust law," always won the argument because every time Microsoft did that, they gained market share.

0:24:49 SC: And they won, yeah.

0:24:49 CD: And then you have the antitrust action. And the antitrust action against Microsoft is really interesting 'cause it was the first one where the depositions were video-recorded and released to the public, and Bill Gates, when he was deposed, lost it. So he is clearly... He's on the spectrum, and his spectrum behaviors became very, very obvious during that deposition in a way that we had not seen in Bill Gates' public appearances. He starts rocking, he starts stimming, and then when he's talking to them, he's very belligerent and very unsympathetic, right? And there is a story from Microsoft insiders that says that, after that, every time anyone proposed doing something really terrible, someone in the boardroom would say, "Don't make them put Bill back on the stand." Right?

0:25:39 CD: It is often credited with the growth of Google, right? How did Google manage to avoid the kind of anti-competitive behavior that Microsoft was happy to engage in with every other competitor? Well, it might have been that they were stayed by what you might call the policemen inside, right, the internalized belief that if you did this, that you would face terrible consequences.

0:26:02 CD: Certainly that seems to have been the case with IBM. So in '81 or '82, when IBM was creating its first PCs, they did a bunch of things that were uncommon for IBM. So one was that they used commodity components, and this is a company that had been known to make its own machine screws that were proprietary for their mainframes. They also did not sue Phoenix, the company that reverse engineered their ROMs and reimplemented them so that Compaq and every other PC manufacturer could make PC clones. Now, that year was also the year that the DOJ was finally stepping back from a 12-year antitrust enforcement action against them over mainframes that only ended because mainframes ceased to be a line of business for IBM. Getting them for... Antitrust over mainframes would be like...

0:26:48 SC: Have that monopoly. Go ahead.

0:26:49 CD: Yeah. That monopoly is safe, right?

0:26:52 SC: Yeah.

0:26:54 CD: And again, there are lots of people who say that the reason that IBM did these extraordinary things is that the people in the boardroom who had historically said, "Let's not use bullying tactics against our competitors. Let's use commodity components because it allows us to iterate faster, and even though we don't control the whole supply chain, we can do more." That those people, after 12 years of being batted up and down 12 miles of dirt road by the DOJ, finally started to win the arguments.

0:27:29 SC: Is there a problem that people seem to maybe like living in walled gardens, or even that they like sort of handing over some of their property, their rights to use their own property, to these major corporations, like with phones and TV services and so forth?

0:27:45 CD: So if that were the case, we wouldn't need countermeasures, right?

0:27:47 SC: Yeah.

0:27:47 CD: If people loved paying extra for ink because they knew it was reliable, then printer manufacturers wouldn't have to take all those countermeasures to prevent third-party ink, and if people loved the convenience of letting Apple decide which apps they could use, then Apple wouldn't need to take so many countermeasures to prevent third-party app stores. The reality is that locking users in creates a kind of moral hazard, right? It takes all the things that are good about your product that your customers actually like, like the reliability of your ink or the fact that the apps are vetted and have high quality, and invites you to abuse that trust because you know that if you abuse the trust once they've kind of wandered into the walled garden, that there's no way for them to use markets to punish you. And so you see the monotonic ratcheting up of printer ink prices and of incredibly restrictive terms over printers. So you know HP now makes printers where you only rent the ink and so you subscribe.

0:28:43 SC: Like really you rent the ink?

0:28:45 CD: Yeah. You subscribe to a certain number of pages per month, and it doesn't matter how much ink is in your printer. If you've exceeded your page budget, your printer won't print, right? Hasbro just did this with its new Nerf gun. They have a Nerf gun that won't fire third-party darts and it uses technical countermeasures to prevent you from firing third-party darts. If people loved Hasbro darts, those countermeasures would be superfluous.

0:29:07 SC: I think of the K-Cup coffee makers as the quintessence of this, right.

0:29:10 CD: Yeah, sure.

0:29:10 SC: Like you buy the machine, but then you're on the hook for the rest of your life to those little cups.

0:29:13 CD: Yeah. Yeah. And if it's true that that's what people want, you wouldn't need a patent and you wouldn't need copyright enforcement. You wouldn't need terms of service. You could just rely on your customers' incredible adoration for your product.

0:29:29 SC: But people do seem to like Facebook, right, or at least most of them?

0:29:33 CD: 13 million people left it last year.

0:29:35 SC: Yeah, that's true.

0:29:35 CD: Facebook has at least as many hostages as it has users.

0:29:39 SC: Yeah.

0:29:39 CD: Right? Again, like I think that they're planning...

0:29:42 SC: By the way, this is totally me being devil's advocate.

0:29:43 CD: No, I understand.

0:29:44 SC: I basically left Facebook because I find it repellent.

0:29:46 CD: Yeah. Yeah, I'm a Zucker vegan.

0:29:48 SC: But there's a lot of people there.

0:29:49 CD: I don't use any Facebook products, you know? I think that Facebook does have this incredible advantage that is separate from its monopoly advantage, which is the network effect. I am skeptical of network effects as the kind of central explanation for how we ended up with these monopolies, but the network effect of, like, everyone on your kids' little league team using Facebook to organize games means that you either take your kid out of little league or you get a Facebook account. That clearly works to Facebook's favor.

0:30:22 CD: The monopoly part is that it's not hard to imagine a third-party service that would allow you to monitor the things going on with the little league game while not having to have a Facebook account. Facebook, it's very interesting, has in the wake of Cambridge Analytica used the moral panic over it to militate against any service that would allow you to do that because they say, "Well, if you can monitor the little league game, maybe you could evade our anti-harassment or anti-data mining or anti-political manipulation tools, you should... " And in fact, Zuckerberg, in these leaked audio recordings from his internal meetings that just came out this week as we're recording this, says that Elizabeth Warren is misguided in wanting to break up Facebook because only someone with the kind of resources that Facebook has could prevent political manipulation.

0:31:17 SC: It's a benevolent dictatorship, yes.

[laughter]

0:31:19 CD: Well, and it's hilarious that Zuckerberg's answer to his catastrophic failure to prevent political manipulation is to entrust him and deputize him to prevent political manipulation, and talk about fool me twice, shame on me.

0:31:33 SC: And I love the prevalence these days of when you're trying to sign up for a new service, they give you the option of logging in with Google or logging in with Facebook. And I think I was trying to do one just the other day where those were the only two options.

0:31:46 CD: Right.

0:31:46 SC: Like I couldn't create an account on the service, and that just seems to be handing myself over to these big companies, but... I mean, this is why I keep pressing on this because the convenience of it must be attractive to a large fraction of people even if there's others who don't like it. You know they just live on Facebook and let Facebook handle everything.

0:32:03 CD: Yeah, but there's... I think you've got an "or" where you want an "and," right? You could have Facebook manage all your logins.

0:32:10 SC: Yeah.

0:32:11 CD: And, you could have the ability to take all that stuff outside of Facebook and put it somewhere else if you decided that Facebook wasn't a good steward of your stuff, and if that were the case, right, if Facebook could not prevent that, then they would be incentivized to be more respectful of your privacy and your attention and your data, right, that as our conservative friends like to remind us, incentives matter. Facebook has no incentive to treat you with dignity because they know that you're stuck. It's like that old Lily Tomlin sketch about the phone company: We don't have to care. We're the phone company, right?

[chuckle]

0:32:47 SC: Yeah.

0:32:47 CD: Facebook doesn't have to care. Even if you leave Facebook, you will become an Instagram user. Why would Facebook ever care?

0:32:54 SC: And it sounds like the only way to really have change here is through legal means, or... I mean, it's not going to be small actions by groups of users complaining, right?

0:33:03 CD: So I think that Larry Lessig's framework for change really works here, that the four drivers of change are code, what's technologically possible, law, what's lawfully permissible, norms, what's socially acceptable, and markets, what's profitable. And clearly, competing with Facebook could be profitable, that under normal circumstances, markets would actually go a long way to correcting the worst excesses of Facebook, but there are legal impediments to those corrections. Now, the fact that there are businesses like Snap that are really pissed off at Facebook and like Yelp that are really pissed off at Google over their anti-competitive behavior means that there is a commercial constituency to push for legal reform. So it's not just consumer rights groups that are doing this. Now, I don't think that Snap or Yelp are gonna be any better stewards of their power than Facebook is. I'm not a tech exceptionalist, right? I don't think that tech executives are either so virtuous that they can't be regulated or shouldn't be regulated or so venal that they will always be wicked, I think that they are everyday sociopaths, and incentives matter.

0:34:13 CD: And if you let them get away with murder, then they'll have blood on their hands. So I think that although we have these commercial constituencies that will militate for legal reform that are... The eyes on the prize should be about a much more pluralistic world where you have lots and lots of people doing it, and that's where things like technology come in, 'cause one of the things that we know is that whole products can be replaced with small scripts, that a lot of times, users have taken something that started off as something a network administrator would do to automate the most boring parts of their job and just turned it into a thing that they do all day long, and just make stuff happen. And so users, given access to easy technology that toolsmiths can make, that the laws don't threaten, that people have an understanding of the need for because they've had a normative shift, that we can effect gross changes in user behavior. The risk is to just put this all down to individual choice, right?

0:35:14 SC: Yeah.

0:35:14 CD: It's the same as with climate change, that the reason we have climate change is not because of your lack of recycling diligence, and the reason we have Facebook is not because you individually didn't choose to leave Facebook, that these big social factors that are at play are what has created this dominance for Facebook. It's what's created the social crisis, the climate crisis, and averting it does require individual action, but that individual action is to join groups and to use those groups to effect social-scale change, that it's not gonna happen in your blue box. It's gonna happen at your city hall meeting, at the ballot box, in the streets. That's where the change is gonna come from.

0:36:01 SC: And you've written about this stuff in numerous venues and genres, and one of them I need to mention is science fiction, and you have a new book out called Radicalized, and my favorite story, and it was... Was it Unauthorized Bread?

0:36:14 CD: Unauthorized Bread, yeah.

0:36:15 SC: Unauthorized Bread, which deals directly with this idea of the company having the right to let you do things. Why don't you tell the audience a little bit?

0:36:22 CD: Sure. Yeah, Unauthorized Bread, it's a story about people in refugee housing that has been created as part of a variance to a developer, a property developer who wanted to build a luxury building and wanted planning permission to add extra stories to it over the limit, and they said, "Yes, you can, but some of those stories have to be subsidized. They have to be below market rent." And this happens all the time, and just as with those buildings, there's a poor door and different lobbies for the poor people...

0:36:46 SC: Yeah, certainly here in LA, we get a lot of this.

0:36:48 CD: Yeah, and New York, London, all over the place. So you don't go through the marble lobby with the doorman, you go literally around the back where the garbage cans are. But they go further. The elevators won't stop for them unless there's no one from the above-market-rent side or the market-rent side who wants to use it, and then every appliance on the poor floors is designed to extract revenue from those people, that the dishwasher only washes authorized dishes and the laundry machines only wash authorized clothes and the toasters only toast authorized bread, and this is this extractive program that we see already in lots of places.

0:37:25 SC: It's not that much of an extrapolation.

0:37:27 CD: Yeah, the poorer you are, the worse the technology treats you. From the subprime credit card you have, which has been tuned with algorithms to maximize the amount of penalties that you end up paying, all the way down to your subsidized cellphone, which will cost you more, and come with more shovelware and spyware than the full market phone that you buy. It's not always true that if you're not paying for the product you're the product, you can also be the product if you're paying for it, [chuckle] but you're more of the product if you're not paying for the product.

0:38:00 CD: And this is bad enough, but then the kinds of hedge funds that like to back this kind of business are also the kinds of financial engineers who periodically take their business through a bracing structural bankruptcy for the purpose of shedding debt and restructuring, and that means that all the servers stop working one day, and then that means the appliances don't work. And this incentivizes the protagonist of this story, a Libyan refugee named Salima, to look up how to jailbreak her appliances. And the experience is so fulfilling...

0:38:27 SC: So that she can toast her bread.

0:38:28 CD: So she can toast her bread. [chuckle] But then because it feels good, because seizing the means of computation just feels right, she teaches other people how to do it, and it kind of sweeps the building like wildfire. And the story takes a turn when these companies start to restructure through bankruptcy. And thanks to Section 1201 of the Digital Millennium Copyright Act, which makes it a felony to bypass copyright locks, they're now all facing felony prosecution. And if you're a refugee, that can mean losing your refugee status, and that can mean death. And so suddenly the stakes go from which bread can I toast in my toaster to life or death.

0:39:05 SC: And how do you find the effectiveness both in getting the message out, but also just for your personal pleasure in writing that in the science fiction vein versus just a straightforward article or book?

0:39:16 CD: Well, you know, I do both, and I think both are necessary.

0:39:18 SC: Right, so you can compare in a way that most people can't, yeah.

0:39:20 CD: Yeah, and both are necessary. But I have a theory of change I call peak indifference, which is that when you have a real problem, but where that problem's cause and effect are separated by a lot of time and space, sometimes it's hard to know whether there is a real problem or what you should do about it, whether that's smoking and cancer, or climate change, or making bad technological decisions. And if you neglect that problem, then over time it will create mounting debt. You'll get sicker from the cigarettes, your house will burn down or be flooded out, your technology will start to abuse you in more grotesque and gratuitous ways. And so after a while, the job of an activist moves from convincing people that there is a problem to convincing them that it's not too late to do something about it, because denialism can slide into nihilism in just a hot second. You can go from, "Eh, I'm sure the rhinos will be fine," to "Alright, there's only one of them left, but since there's only one left, I might as well find out what he tastes like."

0:40:22 SC: And that's right now for climate change.

0:40:23 CD: Sure.

0:40:24 SC: That's exactly what we're hearing right now.

0:40:25 CD: Yep, absolutely, you see this...

0:40:27 SC: The same people who said there never was a problem said you can't... Are now saying, "You can't possibly solve the problem."

0:40:30 CD: Or on the right, you have the rise of eco-fascism, which is an ideology expressed by the Christchurch killer, who says, "Oh no, climate change is real. That means that we have to get rid of half the human race, and it should be the brown people." And that's a...

0:40:46 SC: Surprise! [chuckle]

0:40:46 CD: That's an old ideology. Hardin, who wrote "The Tragedy of the Commons," was an eco-fascist. And the Sierra Club's early founders were... Dabbled in eco-fascism, and so on, but it's come roaring back, it's become a major motif in our society right now, especially among the far right.

0:41:07 CD: So the thing about peak indifference is that the sooner you can reach that moment, the sooner you can reach the moment where a critical mass of people acknowledge the existence of the problem, the easier it is to avert nihilism because the more wiggle room you have to take action, to improve things, the less debt, policy debt you've incurred, and the less obvious policy bankruptcy seems, the less inevitable the policy bankruptcy seems. And fiction is a really good way to make people vividly imagine the consequences of long-off activities. Nonfiction can do it; Silent Spring is obviously an example that often gets cited. But if you think about the role that, say, 1984 played in our privacy debates for 50 years, where the abstract question of privacy... Well, if there were cameras everywhere, I would feel different, and some of my activities would be chilled, is a very bloodless argument, but that as an Orwellian notion imports by reference the entire visceral experience of reading 1984; it takes the dry skeleton of the argument and puts blood and flesh and muscle on that skeleton and makes it very, very literally visceral.

0:42:27 SC: And maybe reach different audiences also.

0:42:28 CD: Sure.

0:42:29 SC: It's absolutely one of the issues in the modern age that different audiences are getting their information from different places. And one of the interesting things about climate change, for example, is how it's become this tribal marker, especially within the United States. There's a lot of conserve... My understanding is there's a lot of conservative parties worldwide who are like, "Well, of course the climate is changing. Why would we deny that?" But in the US it's become something where if you're on this side of other political debates you have to deny climate change.

0:42:55 CD: Yeah, and I think that the fragmentation of our beliefs is important but overrated compared to the fragmentation of our epistemology, of how we know what to believe in. That in a complicated technical society, we long ago had to put away the idea that you would just ask a trusted person what was true, and instead we have trusted processes. That there are reasons that people of good will might disagree about the technical answer to hard questions like what should... What food preparation techniques will allow you to eat your dinner without dropping dead before breakfast, or which pharmaceutical products are safe to use and under what circumstances, or is the reinforced steel joist that's holding up the roof over our head of sufficient strength and flexibility to keep us all from dying from the roof falling on our head?

0:43:52 SC: Or did the manufacture of this jumbo jet actually put in the right safety mechanisms?

0:43:56 CD: Right, right! Or is the tailpipe of your car pumping out so much NOx that it's gonna kill you? All of those things are questions that we can't hope to navigate individually, you... Even if you had the media literacy to know which scientific journals are trustworthy and which ones aren't, and the statistical literacy to evaluate studies to see whether they were performed well, you wouldn't have the domain expertise to then actually look at the technical particulars of all of those studies to evaluate them. But we have a process, we have truth-seeking exercises where independent adjudicators hear from multiple experts, they listen to the competing claims, they explain their reasoning when they come to a conclusion, they are bound by strict ethical guidelines about how they can be related to the parties whose claims they're hearing, and there is a process for appeal if new facts come to light or if the process was revealed to have had flaws.

0:44:54 CD: But that process has become increasingly fraught. The ability of truth-seeking to actually look for the truth is now cabined by the extent to which the truth gores a billionaire's ox. And so truth-seeking has become something of an auction, and that is really problematic. You alluded to Boeing and the 737 MAX. The 737 MAX was the result of a decision by an expert body that Boeing could self-regulate certain elements of its safety features. That was wrong on its face; it should have been obvious that that was wrong. The reason they came to that conclusion...

0:45:28 SC: The incentives matter. [chuckle]

0:45:29 CD: Right. And the reason they came to that conclusion is because aerospace is super concentrated, and Boeing has been self-regulating for a very long time because all of the regulators are drawn from its executive ranks or the ranks of its direct competitors, who are... When there's only five companies left in an industry, all the executives in each company used to work at the other ones. [chuckle] So it's... You really end up with just one company. So you look at the FCC, the good FCC chairman we had under Obama, Tom Wheeler, was a Comcast lobbyist, and the bad FCC chairman we have under Trump, Ajit Pai, is a Verizon lawyer. When truth-seeking becomes an auction, you are cast into an epistemological void where literally you could die because you don't know what's true and what isn't, pharma being a really good example.

0:46:20 CD: People who don't believe in vaccinations, I think, are wrong, but the story they tell of why they shouldn't trust vaccines is right. They say the pharmaceutical industry is super concentrated, it's run by financialized management elites who don't care if they kill the people who take their products, and the regulators who are supposed to regulate them actually let them get away with murder. And as Exhibit A, I would cite the opioid epidemic. And it's hard to understand why the conclusions that our truth-seeking exercises have come to about vaccines are true while the conclusions that they came to about opioids were false, because the reason we have the opioid epidemic is not just because of the Sacklers and Purdue Pharma, it's because the NIH and the FDA allowed tainted evidence to produce guidance about the safety of opioids that was wrong and should have been understood to be wrong.

0:47:12 CD: It wasn't even a particularly good forgery; the majority of it all cited back to a five-sentence letter sent by a Boston University doctor, Dr. Jick, to the New England Journal of Medicine in the 1980s, where he observed in a qualitative but not quantitative way that the patients in his hospital were not becoming addicted to opioids when used for pain relief at the rate predicted by the literature as it existed. And it was literally just a letter, not a study. It became the most cited thing that the New England Journal of Medicine ever published, they call it the five most consequential sentences in the history of the New England Journal of Medicine.

0:47:47 SC: Wow, I did not know about this, yeah.

0:47:48 CD: It should have been obvious. It should have been obvious to anyone whose job it was to keep the pharmaceutical companies honest that this was not about what was healthy or safe, it was about what was profitable. And so you can't really fully fault vaccine denial, because the shape of the vaccine conspiracy, the alleged vaccine conspiracy, is the shape of a real conspiracy, the opioid conspiracy.

0:48:16 SC: And sometimes there are conspiracies, right, yeah.

0:48:18 CD: Increasingly there are conspiracies. So there's a widespread belief among some African-Americans that in Katrina, the reason that the black parishes were flooded was that the levees were dynamited to spare the white neighborhoods. I don't think that happened; it seems that that didn't happen. However, in the '50s, in Tupelo, Mississippi, they dynamited the levees to flood the black neighborhoods and spare the white neighborhoods. And so in the absence of confirming or disconfirming evidence, the hypothesis that maybe this happened... Once is accident, twice is coincidence, three times is enemy action. If it's happened a bunch of times, it's not unreasonable to think maybe it happened again. You scratch a Ufologist, you find someone who knows chapter and verse about real military and aerospace cover-ups. Now, I don't think that Area 51 is stuffed full of aliens, [chuckle] but I do think that there are military cover-ups.

0:49:20 CD: And so what we end up falling back on rather than does the truth-seeking exercise think it's true or not, is this heuristic of does someone who says things that have turned out to be correct tell me that there is a conspiracy afoot. And if so, I guess I'll just trust them based on whatever they say. Donald Trump tells you the system is rigged, and the system is rigged. And then when he tells you that climate change is a Chinese hoax, he is credible because he was the only politician during the debates who would say the system was rigged, and that part was true. He was gonna rig it more, [chuckle] but he wasn't lying about that. And I think that this epistemological crisis, our inability to know whether something is true as opposed to what we believe in, that this, more than anything, is responsible for the fragmentation in our beliefs. And we focus too much on the fire and not nearly enough on the kindling piled up all around the base of the trees.

0:50:27 SC: But where does it fundamentally come from? What is the difference between now and 50 years ago in this, is it just because the ways that we get our information are different, or is it the...

0:50:34 CD: No, I think it's monopolies, and I think it's inequality. I think it actually has the same root as what's going on on the internet, that if you wanna look at it through Thomas Piketty's lens in Capital in the Twenty-First Century, you have a period through most of the world in which the share of wealth owned by the richest was very high, and the share of wealth owned by everyone else was very low. There were a couple of events that ended up rebalancing that temporarily, the French Revolution was one. Manumission in America was another one, the majority of wealth on America's national balance sheet was in enslaved Africans. The conversion of enslaved Africans in law from assets to humans wiped a substantial fraction of the gross national wealth off the books. And since the majority of that wealth was held in a few hands, you ended up with a much more equal civilization and much more equal nation, one of the most equal in the history of the world, and that what happens after that is a period of enormous growth and pluralism. It's by no means perfect; Reconstruction, Jim Crow, are obviously not pluralistic policies. But the policies overall compared to the pre-Civil War policies are much more pluralistic and inclusive, and that's in part because the rich people didn't have as much money, and so they couldn't spare as much to affect policy outcomes.

0:52:02 CD: And the other thing Piketty says is that over time wealth concentrates. That even in fast-growing, dynamic economies, everything else being equal, rich people get richer. And he has this great parable where he contrasts the fortunes of Bill Gates, Microsoft founder, with Liliane Bettencourt, the heiress of the L'Oreal fortune, richest woman on earth, who's never done a day's work in her life, with Bill Gates, investor. So after Bill Gates quit Microsoft, he had an equally long career as a financial plumber moving money around as opposed to making things that people needed. And Bill Gates' fortune over the period in which he founded this most successful company in the history of the world grew more slowly than Liliane Bettencourt's fortune. So the new money in Liliane Bettencourt's bank account over the same period, as someone who did nothing but own assets, was larger than Bill Gates'. But Bill Gates' fortune as an investor grew by more than either. So just being someone who owns things will always, all other things being equal, make you richer even than people who do things that make all of our lives better.

0:53:08 CD: And so over time, wealth begins to concentrate. So you have a bunch of events that make wealth more equal, and then wealth concentrates and concentrates. And one of the things about concentrated wealth is that it's intrinsically unstable, because you can't pursue policies that are good for society; instead you have to pursue policies that create parochial benefit for rich people, and over time that policy debt manifests in political instability, which creates revolutions or wars or other kinds of crises or catastrophes. In Piketty's view, the big one was World War I, the inter-war period, and World War II, which was the largest capital destruction in the history of the world. And because the capital prior to that had been held in so few hands, perforce the majority of the capital destroyed belonged to rich people.

0:53:52 SC: Rich people, yeah.

0:53:52 CD: 'Cause they were the only people who had any money. And then you have what the French call the 30 glorious years, les Trente Glorieuses: the creation of the welfare state, the most prosperous period in human history. And over that time, you also have an accumulation of wealth; the rich are getting richer. By the mid-'70s, the share of wealth owned by the top decile had reached a tipping point, and you start to see the projects that elect Reagan, that dismantle progressive taxation, that dismantle redistributive policies that limit intergenerational wealth transfer, like inheritance taxes. You start to see the dismantling of anti-trust protection, of labor protections, of all of the things that are pluralistic and generate public good, in favor of private good for an increasingly small cohort of increasingly wealthy and powerful people.

0:54:39 SC: But how does this lead to the anti-vax movement? I'm missing that...

0:54:42 CD: Well, it creates industries that only have four or five players.

0:54:46 SC: Okay.

0:54:46 CD: That have so much money that when the FDA is considering what to do, chances are everyone who's working at the FDA in a decision-making role used to work for one of those companies. And those companies have a lot of money to spend lobbying the FDA. That's the same way you get the network discrimination policies under Ajit Pai. Killing net neutrality was one of the most expensive regulatory adventures in the history of any industry. And it was only possible because Comcast had cornered these monopolies, and AT&T had cornered these monopolies, where they divided up America and said, "We will serve here and you will serve there, and we won't compete with each other. We'll make more money by charging more for delivering less and systematically under-investing in infrastructure, pocketing money intended for infrastructure build-out in rural places and underserved places without delivering that either, and we will use some of that surplus capital. Most of it will go to our shareholders, but we'll use some of that surplus capital to continue to lobby for even more lax rules that benefit us more at the public expense." That, at its core, is just the truth-seeking exercise up for auction. That's what it means to influence policy for parochial rather than public benefit: to turn the truth-seeking exercise into an auction.

0:56:01 CD: A really good example of this, that makes it super obvious that truth-seeking can become auctions, is in West Virginia. We think of West Virginia as coal country; it's actually chemical processing country. The major industry there is chemical processing. And because of monopolization, one company is larger than all the rest put together, and that's Dow Chemical. So when the chemical industry lobbies West Virginia, it's Dow lobbying them. So the chemical industry's lobby group lobbied the state of West Virginia for variances in the national limits on how much toxic run-off from chemical processing could enter the drinking water supply. And they argued that the national levels were too restrictive and that West Virginia could have less restrictive rules. And they argued for that on the basis that the national levels were set based on the national BMI, and that West Virginians [chuckle] are so much fatter that the poison would become more dilute in their tissues.

0:56:53 SC: They can soak up more poison, yeah.

0:56:53 CD: And also that West Virginians hardly drink water. This was their basis for asserting this. Now, you're a Californian; you remember before we had recreational marijuana, we had medical marijuana, and in theory you could only get medical marijuana if you had a condition that demanded it. But in practice, there was a box on the form where you were supposed to write depression or trouble sleeping or glaucoma and...

0:57:13 SC: The promise.

0:57:13 CD: And then they would hand you the medical marijuana card. The actions of Dow Chemical and their lobby in West Virginia are the actions of a company that knows that they don't need a good reason, they just need a reason. They literally could have written... My daughter, I was helping her with her math homework last night. One of the questions was like, "Evaluate whether y equals x squared would produce this point on a line and explain your reasoning." And she said, "No because I said so." Right?

0:57:39 SC: Well that's reasoning. Yeah.

0:57:42 CD: It is the action of a firm that understands that in a truth-seeking exercise, all they need to say is "Because I said so," and that is reason enough.

0:57:51 SC: And so... But just to make it super, duper clear: in one part of the story, the monopoly power leads to corporations or interests that want to promulgate a line of truth, but there's also a reaction against it, because you don't know who to believe anymore, and that's why...

0:58:07 CD: Sure.

0:58:07 SC: So there's no big corporation that is promoting anti-vax, but it becomes an epistemologically respectable position if you can't believe anything you hear.

0:58:16 CD: Sure. Yeah. It becomes... What we have to fall back on is someone who sounds trustworthy rather than a process that is legitimate. And someone who sounds trustworthy can be someone who tells you a true thing and then says a bunch of false things. I once had a meeting with this guy David Allen, who wrote this great productivity book called "Getting Things Done"; it's a really good book. And I read it and it helped me get a lot more things done. It's how to use cool checklists to manage your time. And I said, "Where did you get that core method? How did you develop it?" And he said, "Oh well, I took all the best stuff from Scientology."

[laughter]

0:58:50 CD: Because when they make you a Scientologist, the first thing they do is teach you a bunch of stuff that works, and then they start sucking money out of your wallet, right? You know, con artists call this putting someone on the send. So when you fall under the spell of a con artist, the first thing they do is pay you, right? They get you to participate in a scam where you win. And they say, "Oh let's... Now that you see that the scam works, we're gonna put you on the send. Go home, cash out your kids' college fund, get a second mortgage, and empty your 401k, and come back and we'll do it again." So this is putting you on the send, right? Donald J. Trump says the system is rigged, and you go, "Huh... "

0:59:28 SC: It kind of is rigged.

0:59:31 CD: "It is rigged."

[laughter]

0:59:31 CD: Yeah. And then he puts you on the send, and he says, "Climate change is a Chinese hoax and it's all immigrants' fault." And the fact that he's told you one true thing makes the other things more credible. It's like Douglas Adams and the towel, right? In The Hitchhiker's Guide to the Galaxy, if you have a towel, people will go, "Gosh, if he's got a towel, he must be really prepared."

[laughter]

0:59:48 CD: And then if you say, "Well, I've lost my toothbrush and my soap, and my shampoo, and my backpack and my rail pass," they'll give you everything else because you've got the towel.

0:59:57 SC: You have that credibility. So is there a strategy at the individual level for dealing with this world we're in, where it's harder to know who to trust?

1:00:04 CD: I think that ultimately that is the great crisis, right?

1:00:07 SC: Yeah.

1:00:07 CD: That without that, it is very hard to know what to do and when to trust someone. And I can't evaluate all the truth claims that I need to get on the right side of in order to survive and thrive: to know whether or not the index fund that my 401k is in is being run by grifters, and to know whether or not the Common Core curriculum that my daughter is going through is gonna prepare her adequately for life. All of those things. Like, I can't personally [1:00:37] ____...

1:00:37 SC: Medical care?

1:00:37 CD: Medical care, yeah, all of that stuff. I'm a Canadian and so medical insurance just freaks me out. And when we got medical coverage here, we let my wife's employer talk us into getting a health savings account, which turns out to be the biggest scam in the world, right?

[laughter]

1:00:54 CD: And I was like, "The way you describe it sounds reasonable." And when I Google it, I get a bunch of contradictory things. All I know is that last year we put thousands of dollars into this thing that they then clawed back on January 1. "But what do you mean? I was saving that in case I needed a hip replaced."

[laughter]

1:01:09 CD: So the individual can do very little except band together with other individuals to lobby for structural changes. We can muddle by as best as we can. In the same way... I have colleagues in the digital human rights world who talk about cryptography as a way of defending human rights. They say, "Well, if you can keep secrets from powerful states, then you can keep them from knowing who you are and what you do and using that to punish you." And there's an element of truth to that. But defenders have a harder time than attackers, right? Eventually you will make a mistake, right? The state that wants to surveil you can afford to have three shifts of agents watching everything you do. Whereas when you get to hour eight and you've got sand in your eyes, and you're seeing double and you're tired and hangry and you're making dumb mistakes, you don't get to rotate another you on to make sure that you never recycle a password or download a firmware blob without double-checking the hash of it, or any of the other things that you need to do to have perfect operational security. So, for me, the role of cryptography is not to create a kind of stable demi-monde that lives alongside an unaccountable, illegitimate state.

1:02:31 CD: It's to create a kind of temporary shelter that we can hide under while we organize to make the state more accountable and legitimate. Ultimately, all of us are vulnerable to what's called rubber-hose cryptanalysis, right? Your cipher may be so strong that if every hydrogen atom in the universe were turned into a computer guessing what your password or your pass-phrase is, they would run out of universe before you ran out of pass-phrase combinations. But if someone can tie you to a chair and hit you with a rubber hose until you tell them the pass-phrase, it doesn't matter.

1:03:07 SC: Right.

1:03:07 CD: So what you need is not just cryptography, but the rule of law. You need for it to be illegal for your government to kidnap you and tie you to a chair and hit you with a rubber hose. Now, cryptography can help you defend yourself against an illegitimate state while you organize to make that law a reality, but not forever, right? And so really, there is no substitute for the legal code that sits alongside the technological code.

1:03:33 SC: And does that help also... It's not only that I want to be right and it's difficult to know who to trust, but I want other people to be right too and I don't necessarily trust that the typical median voter, for example, is trying that hard even to be right. What are the systematic strategies we can use just to share truth more widely throughout our society?

1:03:53 CD: So, I would disagree with that characterization.

1:03:56 SC: Okay.

1:03:57 CD: There are some people who are on the sidelines of some of these questions. People who maybe don't feel strongly one way or another about vaccines, but they do what the doctor tells them. But you will be hard-pressed to find a casual anti-vaxxer. Anti-vaxxers aren't anti-vaxxers because they don't care about the truth. To harden that anti-vax position requires really deep effort, right? A lot more energy than most of us put into most of the technical aspects of our lives. Think about vaccines as just one of the many health interventions that are made over the course of your life; to be really into that one, I mean, it's nerdy, right? It's like being really, really, really into... Differential transmissions or something; not just into cars, but into a really specific kind of spark plug. And so it's not that those people don't care about the truth, it's that they're wrong, and the method by which they determine the truth is flawed, which is why they're wrong. But it's not that they lack fervor to discover the truth.

1:05:07 SC: So giving them better ways to figure out what the truth is, is the biggest part of the...

1:05:10 CD: Creating legitimate systems that we can point at and trust is... Sometimes people go on about the '50s and its complacency, the time in which everyone trusted their government even as their government was engaged in dirty wars in South Asia, or post-war reconstruction, colonialism in the Pacific Rim, or whatever. But the legitimacy of the state is a very important asset for a functional society. Illegitimate states create low-trust societies that are polarized and that are incoherent in how they arrive at the truth. We hear a lot about Russian interference, and I think that it's overstated, if for no other reason than that if a basket-case failed state like Russia can tip us over into a chaotic world like the one we live in, then our world was more fragile than it should have been...

1:06:14 SC: That's right.

1:06:14 CD: Our state was more fragile than it should have been. But one thing that guides the oligarchic movement in Russia is a media strategy that is centered around making it hard to know what the truth is, not just about pushing an alternate line of truth, but about making the truth itself seem unattainable. So Putin's media strategist cheerfully admits that some of the opposition groups in Russia are ones that he's secretly funding and that he's planning to provoke towards [1:06:45] ____...

1:06:45 SC: More noise, more confrontation, and more controversy.

1:06:47 CD: But he won't tell you which ones.

1:06:48 SC: Yeah, of course.

1:06:49 CD: And this is why it is worrisome to think that maybe some of the Black Lives Matter groups on Facebook were not legitimate, or were insincere, or whatever it is Facebook calls them, because it means that everybody has to wonder whether or not the group that they're in...

1:07:10 SC: Is this one legit?

1:07:11 CD: Is legit. It's very destabilizing to our power of collective action, and so having a legitimate state, a state that is accountable and transparent and legitimate and pluralistic is... There's no substitute for it. And once you have that state and then you start to abuse it, you can coast for a while, but eventually a crisis builds up, and that's where we've arrived.

1:07:40 SC: Well, I'd like to end... We're reaching the end of our time a little bit here.

1:07:43 CD: Sure.

1:07:43 SC: I'd like to end on an optimistic note.

1:07:44 CD: Sure.

1:07:45 SC: We're not succeeding right now, what is...

1:07:46 CD: No, I have an optimistic note for you.

1:07:47 SC: Yeah, what's the optimistic note?

1:07:48 CD: So there's a copyright scholar called James Boyle who runs the Duke Center for the Study of the Public Domain with Jennifer Jenkins, and Jamie wrote a book called The Public Domain where he talks about information politics by means of the metaphor of environmental politics. He says that before the term ecology was coined and came into widespread use, you had people who cared about whales, and you had people who cared about fresh air, and you had people who cared about tailpipe emissions, and you had people who cared about the ozone layer, but they didn't think of themselves as all being part of one fight; they thought of each of them as separate fights, and the term ecology welded them together into a movement. It made every one of those causes aspects of the same fight. We don't even have another way to describe it except to call it an ecological fight, and that is what is mobilizing millions of people in the streets. The people who care about climate change, some of them care about it because they care about drinking water, and some of them care about it because they care about natural habitats, and so on and so on, but they're all together because they all agree that these are facets of the same struggle.

1:08:56 CD: Meanwhile, inequality and monopolies and corruption have produced constituencies from so many different domains for pluralistic reform. So there's one company that makes all the eyewear you've ever heard of. It's called Luxottica. They bought every single eyewear brand. First they bought all the major retailers, Sunglass Hut and Sears Optical and Target Optical and LensCrafters, and then they refused to carry any eyewear brand that wouldn't sell to them, until they drove them to their knees and picked them up for pennies. And literally, if you've got any of the major eyewear brands, it's Luxottica, and if you bought from any of the major retailers, it's Luxottica, and if your lenses came from Essilor, which is the largest lens manufacturer in the world, they came from Luxottica, and if your eye insurer is EyeMed, which is the largest vision insurer in America, it's also Luxottica. And they've raised prices 1,000% in 10 years.

1:09:46 CD: And so there are a bunch of people who wear glasses who care about pluralism and monopoly. But there used to be 30 wrestling leagues, and now there's one, and it's owned by this billionaire Trump donor called Vince McMahon. He reclassified all the wrestlers as contractors and took away their health care; they're dropping dead in their 50s, and GoFundMe is full of famous wrestlers begging their fans for money to pay their medical bills. So if you're a wrestling fan, you're with the eyewear people.

1:10:09 CD: Now, there are three talent agencies left in Hollywood, and all of the screenwriters fired all of their agents, because the talent agencies are now all owned by private equity funds, and those private equity funds have decided that, to increase their return on investment, the agencies are gonna start doing what's called packaging, where they package a writer and a director and a whatever, and they collect a fee from the studio for that package, and in exchange they agree to take less money for each of their clients. And so every screenwriter in Hollywood is there with you to fight for wrestlers and eyewear.

1:10:44 CD: And then there's technologists who care about the web only having five giant websites filled with screenshots from the other four. And there's people who work in the energy sector. And there's people who work in finance, like the whistle-blowers who got fired for refusing to help Wells Fargo defraud their customers, and then there's the two million customers that Wells Fargo defrauded. So over and over again. This is the eleventh anniversary of the bailout. Everyone who's pissed off because their house was stolen by a mortgage issuer who used robo-signing to forge the documents needed to steal their house is there for you on this corruption fight. They just don't know it.

1:11:18 CD: So we are on the verge of going from denial to nihilism, of going past peak denial... We are on the verge of having a potential mass movement. And our job is going from convincing people that there's a problem to convincing them that it's not too late to do something about it. And our political discourse is shifting so fast. These issues are bubbling up in all the major parties. The right wing has suddenly discovered anti-trust, because for years and years they were okay with it, because their paymasters were getting rich from it, and then one day they woke up and realized that if five tech executives decided they didn't like them, they would disappear from the internet and thus from our public discourse, and all of a sudden Tucker Carlson is sounding like Brandeis.

[chuckle]

1:12:00 SC: Yeah.

1:12:00 CD: And so I'm not saying, "First they came for the Nazis and I did nothing 'cause I wasn't a Nazi, and then they came for me." I'm saying, first they came for the trans activists, and then they came for the pipeline activists, and then they came for the sex workers, and then they came for all kinds of people that I cared about, and we did nothing. Now they've finally come for the Nazis, and Tucker Carlson wants to do something. Maybe we can get something done. And my colleague at EFF, Jillian York, is the person who tipped me off to this, and she's so right about it. And so we are really on the verge of a moment in which everybody figures out that this is the issue of our day: that climate change denial, and vaccine denial, and bridges falling down, and infrastructure under-investment, and the racist policies that have created the refugee crisis in Central America, that all of these are aspects of corruption, and that we are all on the same side when it comes to fighting corruption, except for a tiny minority of billionaires, who are themselves increasingly of the opinion that they need to find expensive bolt-holes to hide in when people start building guillotines on their lawns.

1:13:05 SC: I like... It's an interesting perspective, because when you try to cast the political battle as Soak the Rich and decrease inequality, there's people who think, "Well, maybe I will be rich some day," but if you say break up monopolies and decrease corruption, very few people think of themselves as future monopolists.

1:13:20 CD: Yeah, that's right. And yes, temporarily embarrassed millionaires have always been a problem in American politics, as have what the left sometimes calls bootlickers, the people who are like, "Well, they're our social betters. We should just let Elon Musk get on with it, because Iron Man is the greatest." But yeah, nobody idolizes monopolists, except for Peter Thiel, who will tell you that monopolies are efficient because competition just wastes so many resources. Why can't we just have our wise philosopher kings?

1:13:53 SC: Plato, I was just gonna say Plato said the same thing about democracy. Anyway, plenty to think about. Cory Doctorow, thanks so much for being on the podcast.

1:14:00 CD: My pleasure. Thank you.

[music]

8 thoughts on “69 | Cory Doctorow on Technology, Monopoly, and the Future of the Internet”

  1. Excellent interview! I’ve run across Cory in past podcasts but this is the first in depth interview I’ve heard and I really enjoyed the level at which he has thought about the issues you discussed. His clarity of thought made it all the more enjoyable. I’m inspired to go do a deep dive into whatever Doctorow content I can find.

  2. Great conversation, but Cory was wrong about HSAs (I’m guessing he confused them with FSAs, where the money set aside is lost at the end of the year). Given the subject matter, I found it ironic that this piece of misinformation on his part threw into question the rest of his assertions.

    He is quick to explain the unquestionably subjective and diverse motivations and understandings of people who voted for Trump, despite not being a Trump supporter himself (or even a mind reader). Perhaps he should consider that if he is unable to get a basic fact straight about a healthcare option, he may also be incorrect about other, more subjective matters.

  3. I love the clarity of understanding and explanation from Cory and all your guests. I really love that I come away from each episode with useful “So what?” and “Now what?” perspectives on important things.

    Thank you for 1) bringing us all these wonderful people and their ideas and 2) for regularly asking, “What can we do? What’s the optimistic prospect?”

  4. Mr. Doctorow has some interesting observations. But it seems to me that these things come and go. Martin Luther, Marie Antoinette, Karl Marx. The Brits, the Mongols, etc. In the end, democracy and capitalism seem to be the best models we have, and they can only be preserved through the rule of law equitably administered. Doctorow should be less concerned about Microsoft and more concerned about a corrupt free press (not referring to Fox News). Also, the cheap shot at Tucker Carlson of Fox News, suggesting he in any way would support Nazis, deserves an apology.

  5. The information provided about Twitter messages for the purpose of delivering a message without trace is completely incorrect. As per Twitter’s website: “When you delete a Direct Message or conversation (sent or received), it is deleted from your account only. Others in the conversation will still be able to see Direct Messages or conversations that you have deleted.” (https://help.twitter.com/en/using-twitter/direct-messages)

    However, a Tweet in 2012 from @TwitterSupport mentioned: “If you delete a direct message, that DM is removed from both the sender and recipient’s history. ” (https://twitter.com/twittersupport/status/253611215190896642)
    I can see if there is confusion over time, but do check your facts before promoting them.
