283 | Daron Acemoglu on Technology, Inequality, and Power

Change is scary. But sometimes it can all work out for the best. There's no guarantee of that, however, even when the change in question involves the introduction of a powerful new technology. Today's guest, Daron Acemoglu, is a political economist who has long thought about the relationship between economics and political institutions. In his most recent book (with Simon Johnson), Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity, he looks at how technological innovations affect the economic lives of ordinary people. We talk about how such effects are often for the worse, at least to start out, until better institutions are able to eventually spread the benefits more broadly.

Support Mindscape on Patreon.

Daron Acemoglu received a Ph.D. in economics from the London School of Economics. He is currently Institute Professor at the Massachusetts Institute of Technology. He is a fellow of the National Academy of Sciences, the American Academy of Arts and Sciences, and the Econometric Society. Among his awards are the John Bates Clark Medal and the Nemmers Prize in Economics. In 2015, he was named the most cited economist of the past 10 years.

0:00:00.0 Sean Carroll: Hello everyone, and welcome to the Mindscape Podcast. I'm your host, Sean Carroll. As I'm recording this, I recently saw a tweet that tagged me by Gary Marcus, previous Mindscape guest, where he is retweeting a little clip of an interview with Sam Altman, the CEO of OpenAI, the AI company responsible for GPT and various other services, in which Altman says he doesn't make a firm prediction, but he asks us to imagine a day coming soon when you can just go to the computer and say, solve all of physics. And it'll be able to do it. Now, both Gary and I, if you have been listening to anything I have to say about AI, do not think that that moment is anywhere near. It certainly doesn't really betray an intimate knowledge of how physics works or what you need to solve all of physics, which includes things like experimental input, not just a large language model. But maybe someday, maybe not nearly so soon, but someday, something like that could happen. It's absolutely plausible that AI kinds of models might help us increase the pace of scientific progress.

0:01:16.1 SC: So, should I be worried for my job, is the question. And probably not, in the short term, the short term including the rest of my career as an academic. But some people are very reasonably worried for their jobs in other fields. Journalism, of course, is something where AI has been encroaching rather rapidly. You can get a large number of things that look like news stories on the internet that have been completely AI generated. It's not very good. It's not as good as getting a human being to write something. Again, as I'm recording this, there was just an amusing set of notes that were going around Twitter, X if you wanna call it that, because on X there is an AI agent called Grok, which tries to summarize the tweets of the day, and one of its headlines was a scandal.

0:02:08.2 SC: Now, I'm quoting this from memory, but the idea is that there's a scandal on the LA Lakers, the basketball team, because LeBron James had been accused of sleeping with the mother of one of his teammates. Oh, my goodness, what a dramatic soap opera situation that was. But of course, what Grok, the AI, did not figure out is that it was a joke. People did say that, but they were kidding, because of course LeBron James' son Bronny had just been drafted by the Lakers. So indeed, LeBron James was sleeping with one of his teammates' mothers, namely LeBron's wife. Not really the cause of any drama there. But if it's cheaper to produce, even if it's not that good, maybe it can be good enough that major corporations would just like to turn out a slightly less good product, whether it's journalism or anything else, rather than pay all the money for human beings.

0:03:04.8 SC: So the idea that certain jobs will be displaced by either AI in particular, or just technology more generally, is a very, very real one. Something that is very, very worth thinking about, even if physicist is not immediately among them. So this worry, what are we gonna do if technology changes the job situation very dramatically, throwing a lot of people out of work, is one of the recent preoccupations of today's guest. Daron Acemoglu is one of the most cited economists out there today. He's an economist at MIT, author of various books. His most recent one is written with Simon Johnson, and it's called Power and Progress: Our Thousand-Year Struggle over Technology and Prosperity. And they point to a number of historical times when new technologies have been invented. And ultimately, these technologies are very good, right? Like the industrial revolution has overall improved the world, improved life expectancies, improved wages and living conditions and things like that.

0:04:10.8 SC: But they say, look, it wasn't straightforward. There was a moment in the middle of the industrial revolution when the average person was negatively affected by this new technology. And so what is going on? What are the forces that make things worse in the short term, and then ultimately make them better? And there's a story of elites getting lots of power and using it to extract wealth from the rest of the world, and the rest of the world kind of fighting back through institutions that curb the ability of elites to do that. So it's a very intricate and interesting, provocative economic analysis of what's going on in this situation. And you can disagree with it. This is all sorts of messy social science stuff, but as usual for Mindscape, what I care about is, let's think about it, because this is coming. I don't think that AI or other technologies that are changing our lives are simply going to leave the world unaffected. It is going to change things. If we can better anticipate what those changes might be, and what kinds of things we can do to ameliorate the worst possible consequences of those changes, we will be better off. So let's set about doing that. Let's go.

[music]

0:05:40.7 SC: Daron Acemoglu, welcome to the Mindscape Podcast.

0:05:42.5 Daron Acemoglu: Thank you. This is a true pleasure for me to be on this podcast, which I listen to, so.

0:05:47.3 SC: Oh, that's always good. I always like it.

0:05:48.9 DA: Particularly meaningful.

0:05:50.1 SC: Well, technology, inequality, progress, power, these are all big questions that people are debating right now. But one of the things I like about your book, Power and Progress is you do a lot of history, right? There's a lot of investigating what happened, and that's not always exactly the same as the present day, but you can learn something. So I was gonna ask you to go in on a particular historical episode, but maybe I should just say, what is your favorite historical analysis that does shed light in the present day in terms of a technological advance that had certain economic political consequences?

0:06:31.9 DA: Oh, I would say the British Industrial Revolution, but it's also very related to why Simon Johnson and I wrote the book, and why we wrote it this way. I have been doing research on technology and its effects on inequality and the labor market for centuries. It shows how old I am. But especially over the last 10 years, I kept getting versions of the same question from some colleagues, but especially from people in the tech world or journalists: are you saying this time is different? Because we know in the past things have worked out very well, and so if you are worried about robots creating inequality, or AI taking jobs or creating other social tensions, you must be saying this time is different. And I kept saying no, no, I'm saying this time is the same, but it seemed like people had too rosy a picture of what happened in the past. So that's part of the reason why a lot of the book is dedicated to history. And the British Industrial Revolution is a perfect illustration of that.

0:07:48.5 DA: We are incredibly fortunate to be living in a time where we have started using industrial technology and scientific knowledge for improving many different aspects of our lives. The world wouldn't be the same without cars. It wouldn't be the same without antibiotics. You are just so much more comfortable and fortunate relative to people, say, who lived in the 1600s, thanks to the British Industrial Revolution. But that doesn't mean that it was either an automatic or a uniform improvement. So if you look at the people who actually were the workers, industrial workers, in the first, say, 90 years of the British Industrial Revolution, starting sometime around 1750 when the first factories started being built, to around the mid to late 1840s, workers did not fare very well. Real incomes of the average worker did not increase much during this 90-year period. Even though productivity rose and some people made fabulous fortunes, the real wages of the workers in the most dynamic sectors of the economy did really badly. So textiles were the engine of the early industrial revolution, and weavers were the most skilled, most cherished workers. And weaver wages fell to about 30% of what they were before mechanized looms were introduced and factories became the place where you did the weaving.

0:09:37.0 DA: Working conditions generally were horrible. Working hours increased a lot. It was very disciplinarian, it was very oppressive, repressive. Life conditions were awful. People lived in much worse conditions as they moved into the cities to get the jobs. Life expectancy in places such as Manchester and London fell to something like 30 years at birth, just incredibly bad, because the cities were so polluted, epidemics were rampant. So it was really a horrible time. Now, we remember things working out better, and that really did happen, yes. In the 1850s, we see real wages increasing, working conditions improving, child labor being eliminated to a large extent. The factories were still not super nice places, but they started getting better. How did that happen? It didn't happen by itself. It was a very, very contentious process. First of all, it took an enormous amount of political change. The UK was an aristocracy, oligarchy, whatever you're gonna call it; a very small fraction of the population had the vote. And even when people voted, that didn't count for anything. So it became essentially a democracy in the latter half, or the latter quarter, of the 19th century with adult male suffrage, and then in the first decades of the 20th century adult suffrage for both men and women. That was a very contentious thing.

0:11:20.1 DA: The Chartists collected, would you believe it, 3 million signatures in the 1840s to ask for the extension of the franchise, and really made completely moderate demands, nothing that would smell of socialism. And the British authorities, in response, put all of the Chartist leaders in jail. So this wasn't like a happy process of, okay, yeah, of course, we are an enlightened society, let's share political power. Trade unions were heavily prosecuted, and they couldn't negotiate on behalf of the workers, and there was a real power imbalance. So that started changing. Trade unions started organizing, first illegally, and then they became legal and recognized, especially as the country became democratic. And the Brits started using technology differently, not just for automating work and digging bigger coal mines where you could send children as young as six or seven, but trying to make workers more productive.

0:12:20.3 DA: So it was that fundamental change, both in the power balance in society and in the direction of technology, that was so important. And that's the parallel that I draw to today. It's the same thing. If we use a technology badly, and if we let a few people control technology, control information, control data, I am afraid we're gonna repeat what we witnessed during the first 90 years. And it's not so far-fetched, because the United States itself has had the most formidable increase in inequality over the last 40 years.

0:12:55.4 SC: Right. So to dramatically oversimplify here to get it into my little physicist brain, we imagine that there are some institutions in society that are protecting us from the worst abuses, etcetera. But then a new technology comes along, and it sort of gives the elites a better way to extract from everybody else until the institutions can catch up later.

0:13:18.7 DA: Absolutely. I think you as a physicist have done a much better job than I would be able to do summarizing anything on quantum mechanics. But leaving that aside, yeah, I think that's absolutely correct. I think that's a pretty good match to what happened in Britain, because there were better protections for workers in the traditional economy for a variety of reasons. But I also don't want to imply that institutions are ever perfect or well calibrated. And if you look at British institutions in the first half of the 18th century, they were horrible in other respects. Britain was a very class-based, what, very hierarchical society. There were the elites who were viewed as superior. The people, the workers, were called the meaner sort of people; even the terminology was so demeaning.

0:14:26.5 SC: Yeah.

0:14:27.3 DA: But of course, things could get worse, because nobody had any type of protection against the factory system and the concentration of power, concentration of wealth, that it brought... Again, the parallels to today are pretty clear. We think of Standard Oil as the octopus with the tentacles everywhere, but Standard Oil was nothing compared to the big tech companies today. Their control over so many aspects of the economy and society would have definitely made Rockefeller jealous. Now, of course, you can say, oh, well, these are much more enlightened elites relative to Rockefeller. All right, we can have that debate.

0:15:15.0 SC: Yeah. I'm not gonna say that. Well, I can't argue. I'm talking to you via an Apple computer, using a web browser from Google. And I bought your book over Amazon, so yes, I think it's...

0:15:29.2 DA: I am the same, even as I'm complaining. The only one I avoid is Facebook. The other ones I'm completely hooked on.

0:15:34.7 SC: I avoid that too. It's just avoidable enough. I used to be on there and then I left. But yeah, okay, so we see a little bit of the pattern. Maybe one more historical example would be good. I liked the discussion of the agricultural improvements in medieval/early modern times, which I knew much less about than the industrial revolution. We've all read Charles Dickens' books, etcetera, but the repeats of this pattern over and over again are what drive the point home.

0:16:04.1 DA: Yeah, absolutely. And I like that example because, especially if you've spent a lot of time in Europe, we all look at these cathedrals and these amazing monuments and you think, wow, this is like the pinnacle of civilization.

0:16:23.3 SC: Beautiful. Yeah.

0:16:25.0 DA: And when you actually look at how they were built, it sort of highlights this very unequal nature of society and who enjoyed the fruits of technological improvements. So the middle ages were not dark technologically. There were a lot of improvements that changed how agriculture was done, how things were transported. Crop rotations were really innovative at the time, and the windmills were a fantastic technology. But in example after example, you see these new technologies being introduced, but the lot of the peasants doesn't improve. And so you wonder why. Well, if you look at how that society was organized, perhaps it's not so surprising. It was a very, very unequal society, politically and socially. Peasants, many of them, were in servile relationships, semi-coercive or fully coercive relationships. And all of the valuable resources were controlled by the elite, the upper clergy, the church establishment, and the big landowners. And in particular, when the windmills were introduced, they were all under the control of that group. But it was even worse than that.

0:17:51.1 DA: So the windmills, of course, were much more efficient than hand milling, for example. So they still had to compete against hand milling, and other people could come up with their own windmills or other types of mills. But the elite said, no, no, we're not gonna allow that. We're gonna say this is the better technology. You have to come and use it, and you cannot use an alternative technology. And you're gonna do it at the prices that we impose on you. And if that happens, surprise, surprise, they got all the surplus. The peasants didn't experience any of the gain. So what did they do with that money? Well, at the time, you didn't have Apple Watches and Ferraris. So they did the next best thing of luxurious consumption: they built bigger and bigger monuments.

[laughter]

0:18:38.0 SC: So this is a little bit depressing, you're telling me that every time I go to a beautiful cathedral in Europe I should now lament the poor peasants whose wealth was extracted to make this happen.

0:18:50.0 DA: Ah, yeah, yeah, I think you can still enjoy the beauty.

0:18:53.0 SC: Okay.

0:18:53.3 DA: That was pretty impressive, but I think putting it into historical context is good.

0:18:58.0 SC: Okay. And on the side of the elites, how much psychology goes into this? Do we blame them? Are they evil? Or is it just human nature: they're given the opportunity to get more for themselves, so that's what they're gonna do?

0:19:11.8 DA: Absolutely. And they're not evil in the sense of Hollywood movie evil, everybody convinces themselves that they're doing something that's good.

0:19:21.4 SC: Right.

0:19:22.0 DA: They're working for the social good. So it's actually really interesting to go back and look at some of the debates; everybody couches things in the name of the common good, the social good. We're keeping peasants as peasants because that's good for the common good; if they don't stay peasants, then society is gonna collapse. We're taking land away from the peasants, for example, in the enclosures movement in the UK, where land that was assigned or allowed to be used by common people was taken and enclosed and became new commercial land; that was justified as being for the common good, because this way the markets are gonna function efficiently and we're gonna produce more food. And collectivization of agriculture in the Soviet Union, that was in the name of the common good. So that at least, I don't think it should make you jaded or too cynical, but it should at least give you a warning that when this tech company or that tech company says, "We're gonna disrupt the healthcare industry in the name of the common good," at least you should look a little bit more into the details.

0:20:36.0 SC: Maybe the self-conception changes over time because now the tech companies are more... The people in them, there's a certain number of them who are convinced that they're just smarter than everybody else.

0:20:47.4 DA: That's right, that's right.

0:20:47.8 SC: And therefore they should be the ones making the decisions, that might not have been the justification.

0:20:51.4 DA: That's right. No, I think that is actually very, very important. You see some traces of that in the British Industrial Revolution as well. So what's very interesting about the British Industrial Revolution is that it wasn't actually done by the very elite, by the aristocrats, by the wealthiest landowners. It was what the British called the middling sort of people: not the cream of the crop, the upper crust, not the meaner sort, the middling sort. Many of them were children of artisans or teachers. And with social changes, with the scientific discoveries, with shifting cultural norms, there was a big opening for these people. And they were really much more forward looking; they said we can use science or tinkering in order to improve things. But they also felt entitled, in the same way that you would say today, we are the smartest. They felt they were very deserving, and that was really rather interesting when you look at the sociology of it. Because in some ways they were very revolutionary and progressive; they were tearing down the hierarchy that existed at the time.

0:22:13.7 DA: But on the other hand, they looked down upon the workers because the workers were not those talented people who were making the discoveries. So again, draw your own conclusions. But, yes.

0:22:23.0 SC: All this does sound familiar. But okay, even if it's not inevitable progress, most of the time, I guess it's okay to generalize a little bit and say, we did correct after the technological shift happened.

0:22:35.5 DA: We did, we did, we did. Absolutely.

0:22:36.6 SC: And so even if it's not inevitable, how does that happen? How... Is there a theory of...

0:22:41.0 DA: I think that's very, very important. There is an argument that you see coming from the very left, whatever that means, that capitalism is always doomed to be horrible for workers and everything that the market economy does is bad. And I reject that. Perhaps I'm biased because I'm an economist, but I also reject it on the basis of history. When you look at what happened in the second half of the 19th century, that was far better for workers than the first half of the 19th century. If you look at what happened throughout the industrialized world in the three, three and a half decades that followed World War II, workers' wages increased faster than GDP, and there was a real flavor of shared prosperity. So I think there are many different things that can happen; that gives me hope. But what I emphasize is that hope is not the same as blind optimism. So I mentioned the Chartists a second ago precisely to emphasize that it wasn't self-acting, it wasn't automatic. It wasn't like, oh, we have to go through this transition period of pain and then things are necessarily gonna work out. There was no guarantee, and in fact, I think if you ask the Chartist leaders who were just thrown into jail in the 1840s.

0:24:08.0 DA: They would not have said, oh, things are gonna work out; they would have been pretty pessimistic. So that's why I don't think we can say, oh, just let the market rip and after a while things are gonna get better. So you have to have a proactive agenda of, how can we fix this? How can we have better institutions to protect the disadvantaged? How can we steer technology in a more beneficial direction? So rather than just automate, can we make workers more productive? Rather than create filter bubbles and manipulative environments online, can we create more pro-democracy or better information environments online? Because after all, all else equal, of course technology expands our capabilities. If we have better technology, better tools, better scientific knowledge, of course we can do better for everybody. But the question is whether we will choose to do so.

0:25:15.0 SC: So let's get a bit more into the nitty gritty of how to bring some broad-based prosperity after these technological innovations. So you explain what it takes to actually get wages to improve, I don't wanna put words into your mouth, but you talk about marginal productivity, worker power, vision. You've gotta explain all those to us.

[laughter]

0:25:35.0 DA: Of course, I'll be happy to. And you're doing a great job of putting words into my mouth. So keep on doing it.

0:25:39.8 SC: They're your words. [laughter]

0:25:41.0 DA: Well, let me start with this story that's ascribed to a number of different people, so I don't know what its provenance is, but it's this sort of humorous take on the modern factory. It says the factory of the future will have two employees, a man and a dog. The man is there to feed the dog, and the dog is there to make sure the man doesn't touch the equipment. So that is some business people's utopia and some people's dystopia. It's meant to be a humorous story about the factory where there will be no need for human workers, but it actually has very clear lessons about the questions that you asked. If indeed we are heading towards a factory like that, then imagine the equipment suddenly gets twice as good, so we produce twice as much. Is there any reason for that factory, or for that factory's owners, to pay more to the man and his dog, or hire more men and women with their animals? No. The humorous part of the story is that those two employees are completely dispensable: you don't need a dog, you don't need a man.

0:27:10.4 SC: Yeah.

0:27:12.0 DA: So if the factory gets twice as productive, you're still not gonna need them. So if technology will inevitably take us towards that factory, we are in trouble. So the first thing I do, therefore, is to argue against technological determinism. Because what is technology? Technology is just a reflection of our collective knowledge; we learn more and we can apply that knowledge collectively to create better tools, to create different types of tools, to change our environment. There are many, many different ways of doing that, and if you take that perspective and you look at history, I think you'll reach the same conclusion as me, which is that there isn't a predetermined path of technological progress. There are many different ways in which we can do it once we work it out. That was one of the amazing things, and you'll understand this much better than me: all of that nuclear physics in the first quarter, first 30 years of the 20th century, the chain reaction, those amazing, amazing discoveries. Well, you can use them for building nuclear reactors for energy, or you can use them for making nuclear weapons. Very different implications.

0:28:39.4 DA: So in the same way, I think, with digital knowledge and other sorts of advanced knowledge, we can use them exactly like that factory of the future and eliminate workers, or we can try to make workers more productive. And marginal productivity, which you mentioned, is the economist's way of saying that. So marginal productivity refers to what a worker contributes to output. And that is very different from average productivity, which is how much output we produce relative to, or divided by, the number of workers. So in the factory of the future, average productivity is huge: you have a lot of goodies divided by one worker, okay, two if you count the dog. But marginal productivity, the contribution of that worker, is pretty darn close to zero. That's the humorous part of the story. So then the question is, can we use our knowledge to make that worker more productive, so that we not just increase average productivity, but we also increase that worker's contribution? And Simon's and my claim throughout the book is that yes, both historically and more recently, and also from a conceptual and empirical point of view, there are plenty of things we can do to make workers more productive.
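Acemoglu's distinction between average and marginal productivity can be sketched with a toy production function; the numbers below are invented purely for illustration, not taken from the conversation:

```python
# Toy "factory of the future": machines do nearly all the work,
# so total output barely depends on the number of workers.
def output(workers: int, machine_output: float = 100.0) -> float:
    # Each worker adds only 1 unit at the margin.
    return machine_output + 1.0 * workers

# Average productivity: total output divided by the workforce.
average = output(1) / 1                 # 101.0 -- looks enormous

# Marginal productivity: what the last worker actually contributes.
marginal = output(1) - output(0)        # 1.0 -- nearly dispensable

# Doubling the quality of the equipment roughly doubles average
# productivity, but leaves the worker's marginal contribution unchanged:
average_v2 = output(1, machine_output=200.0) / 1       # 201.0
marginal_v2 = output(1, 200.0) - output(0, 200.0)      # still 1.0
```

The sketch captures his point: making the equipment better raises average productivity without giving the owner any reason to pay the worker more, because pay tracks the marginal, not the average, contribution.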

0:30:07.5 DA: After all, a huge fraction of what we do in today's economy involves social interactions, interactions between people who reason and then implement things in the real world. So hand-eye coordination would be one simple version of that, but we do many more complex versions of that in factories, or as electricians, as plumbers, with driving, and so on and so forth. And then we also do a lot of things that involve wisdom, judgment, creativity. All of these things right now are beyond the reach of large language models, generative AI, or all sorts of digital technologies. Social interactions, you still need the human. Creativity, okay, large language models are getting more sophisticated, but they're not really creative; they really don't have wisdom or judgment. And they, of course, right now cannot do anything in conjunction with the real world. So a huge fraction of what we do in the economy is therefore not doable with these AI tools directly. But then the question is, can we try to use these AI tools to make the workers who do these things more productive, in a way that increases their marginal productivity? And my claim is yes, because the main thing that most, not most, but a large fraction of decision makers need is better information.

0:31:42.0 DA: So if you think of AI as an informational tool, not as something that can write Shakespearean sonnets or pretend to be human, but as an informational tool that provides you context-dependent, reliable, real-time information, it can actually make many people more productive. If you're an electrician and you're dealing with a more complex problem, most of the time it's gonna take you a very long time to troubleshoot that, understand what's going on, unless you happen to be the very, very, very best electrician. For most of the remaining electricians, better information can make them much better. So those are the kinds of ways we can try to approach the problem of AI in a way that increases the marginal productivity of labor. That's one. And then the second part of your question, let me be... I don't wanna keep on talking and going on and on, but power. Imagine that we actually use technology to make that man with his dog more productive, and his contribution has increased, but we are in a situation like in the first half of the 19th century, where laws are against workers, employers are organized, they can repress workers. In the UK, there were all these Masters and Servants Acts.

0:33:01.0 DA: So if you told your employer, "I have a better job offer, I wanna quit," they could take you to court and put you in jail. So if you're in an environment like that, you have such an imbalance of power that even if your contribution to output is very large, you may not get any of the fruits of that. So that's why we need both technology that helps workers and a balanced power distribution, embedded in institutions and social norms, that also prevents workers from becoming so marginalized and sidelined.

0:33:33.0 SC: And then vision? There's lots of visions that we could have. [laughter]

0:33:39.0 DA: Lots of visions that we can have. So by vision, we mean the aspirations of especially powerful people, but also of society: what is acceptable, what are our main objectives? So let me try to explain how we think vision is important, again giving an example from AI. If we have a bunch of companies who think that the most exciting thing is to try to rush towards AGI, and that's also socially acceptable and desirable, even if it creates lots of problems on the way, safety concerns, inequality and so on, that's what we're gonna get, because it's the vision of these powerful actors. If, on the other hand, they had a different vision, if they thought, well, these tools should be developed to help workers, that would be different. Or if we had at the table different actors with different visions, for example trade unions, labor unions that said, no, no, no, let's see whether we can actually use these tools to help our workers, our members, that would lead to a different set of outcomes. And there isn't anything in the market system that would automatically say the right one is gonna emerge. The markets are great for allocating apples and oranges and who's gonna get the Apple Watch.

0:35:06.3 DA: But they're not good at choosing between different paradigms of which types of technologies are most inspiring, most achievable and so on.

0:35:17.7 SC: So, just as one trivial point before moving on, there is a problem even for the factory owners in your future factory scenario, namely, who is going to buy the products created by that factory? We need average people to have some income, even in the most unequal distribution.

0:35:35.8 DA: Well, that's what Henry Ford thought. He was very ruthless and he was no friend of workers, but he and some of his leading managers kept emphasizing, "Well, we need a prosperous middle class so that they can buy cars." And the US is having that problem right now. Part of the reason why the economy is not always so dynamic is that an important part of the population doesn't have great purchasing power. But that doesn't bring the economy to a standstill. Look at the new things that are valued: more and more expensive luxury items, personal trainers that are hugely expensive for the very wealthy. One of the fastest-growing occupations is sommelier, so that you can go to expensive dinners and buy the most expensive wines. So you can redirect resources towards producing the goods and services, especially services, that the very wealthy will demand. So I don't think the market system by itself will create the forces to limit inequality through that channel. Of course, there will be tensions and there are tensions, but it's not self-correcting in that way.

0:37:13.0 SC: So how does it get fixed? I mean, how do we nudge things in the direction of giving workers a little bit more power, helping their productivity, their marginal productivity, rather than the average? Is it that we go out and demonstrate? Is it that we vote for the right people at the ballot box? Is it that we have a lot of angry Twitter threads about how to fix it?

0:37:33.7 DA: I don't think the Twitter threads are gonna do it, personally. Look, I think there's no formula. So the two somewhat hopeful examples Simon and I discuss in the book are the progressive era reforms and what happened in Sweden after the Great Depression. The Swedish example, perhaps, is less familiar to your audience, so let me start from there. It's actually super interesting because I think it has some elements that have resonance for today. Sweden was, contrary to what some people might think, a very unequal economy. It wasn't the cohesive paradise that some people imagine today. There isn't anything in the genes or the history of Swedes that said, oh, they were gonna be very cooperative, and there was a big conflict between capital and labor.

0:38:43.4 DA: The Great Depression changed that balance. And in 1932, the Social Democrats came to power. But very interestingly, the Swedish Social Democratic Party, or the Workers' Party, had very, very early, about 20 years before then, completely severed its links with Marxism. They were committed to democracy and reform rather than overthrowing and expropriating capitalists. And they immediately tried to come up with an agenda that would lead to higher wages, stabilize the economy, and so on. And that's the origin of the corporatist model, where business, government and labor unions came together. And what's very interesting is that once those negotiations started, the outcome was one that was not bad for business. So the solution that the Workers' Party and the trade unions came up with wasn't, well, let's take everything from business, 'cause they realized if we do that, businesses are not gonna invest.

0:39:54.0 DA: It's not gonna be good for the workers. So they tried to forge an agreement that was good for workers, but not so bad for businesses. And as a result, businesses thrived in Sweden. They remained innovative, very export-oriented, especially after the war. And there are some very wealthy families and some very big businesses in Sweden. But wages grew rapidly. Inequality remained limited; actually, wealth inequality declined a lot in Sweden, and income inequality declined a lot in Sweden. And the reason why I'm pointing that out is to frame the following two points. I believe if we can get onto a different path, where the tech sector prioritizes increasing worker productivity, providing better tools for workers, that's actually gonna be ultimately good for firms. Because if their workers become more productive, firms will kinda become more innovative.

0:40:51.7 DA: They're gonna provide better services, higher quality services, they're gonna be able to expand into new areas. So it's actually good for firms ultimately as well, so long as we do that, and we don't do it in a way that's completely adversarial, that doesn't try to expropriate the firms. You have to provide the tools that will be good both for workers and for firms. I think that's the objective. And in the progressive era, we see, I think, the same thing as well. First of all, it came out of nothing. At the time you would've been pessimistic about the prospect of the American political system reforming itself. Senators were not directly elected, and the general perception was that each senator was in the pocket of one billionaire, or millionaire at the time.

0:41:42.2 DA: Media wasn't that free. There was no regulation, there wasn't proper antitrust regulation, there wasn't a federal income tax. There were no agencies that could do the kinds of things that we are used to today, like preventing abuses of consumers, false advertising, false information. The conditions looked completely stacked against any sort of meaningful change. And then suddenly, once a movement coalesced around these progressive ideas of a better middle-class life, control of the big corporations, reform, it just became self-sustaining. And not everything that came out of it was perfect, by no stretch; the progressives themselves had some crazy ideas. But something that I think people would've thought impossible at the time happened. And the entire face of US society changed: a much more equal distribution, corporate power became more constrained, the regulatory power of the state started increasing, and it led to meaningful political reform as well. So I think, again, all of these things show that a non-revolutionary path, meaning something that doesn't try to destroy anybody or create huge instability, is possible. And if it happens, at the end of the day, there will be many beneficiaries: consumers, workers, the right type of firms that really focus on this, the right type of tech companies. Absolutely.

0:43:27.7 SC: Now, this does bring us to a huge question in history and economics and politics, which is the relationship between material conditions and ideas. The story you just told gives a certain amount of credit to certain ideas that were in people's heads, and to the ability of people to convince and persuade other people of the correctness of their ideas.

0:43:54.9 DA: Absolutely.

0:43:57.5 SC: Should we give ideas that much credit? I'd be happy to do so. But there's certainly another argument that just says that's just sort of window dressing on the fundamental material conditions underlying it all.

0:44:09.9 DA: No, I am... And this is actually a very, very important discussion within social science. And I am on the side of ideas. This is one of the things that is very difficult to formalize and put discipline on, both empirical and conceptual discipline, and that's something I've been trying to work on. But if you look at history, you see ideas matter. Let me give you another example, since you like talking about history, which I do like talking about too. If you go to the conditions that prepared the British Industrial Revolution, I said the middling sort of people felt liberated and could have these high aspirations. How did we get there? Well, if you look at 17th century Britain, it was a completely different world. It had this thing called the divine right of kings, where the king was the reflection of divine power on earth and could do anything he wanted. There was no possibility of the king doing something wrong or anybody constraining the king. There was this great chain of being, where there is a top and everybody else follows in their subservient part; it's a very, very structured society.

0:45:39.2 DA: And then you have the English Civil War, and everything changes in almost one fell swoop. You have these groups within the New Model Army that Oliver Cromwell had built, the Diggers, the Levellers. They come up with these amazing democratic ideas like universal suffrage, and these ideas start spreading. And you see the ideas are actually struggling with other ideas. People like Thomas Hobbes and John Locke are giving more respectable versions of these ideas, and they're convincing people. And by 1688, when the Glorious Revolution comes, there is a critical mass of English people who think you should have a constitutional monarchy and a constrained government. So it's a complete change, and you cannot just explain that by material conditions changing; it's really that some ideas became much more acceptable.

0:46:50.3 SC: Yeah. Good. And I'm completely persuaded by that, so I'm not gonna be able to push back in any effective way. But it bumps up against another contentious word in this discussion, which is power. Right? And I've seen in various reviews of your books that people are unhappy with how you think about power. So I'm gonna give you a chance to explain it.

0:47:13.8 DA: Please do, yeah, yeah, yeah.

0:47:14.3 SC: But, in some sense, I think again, one could make the case that most power is kind of soft power. It's when one person says something, other people are gonna do what they say. Right? And it's not that a very few people have literally, like, death rays and whatever, and they're forcing people to do things; it's that they have some authority that comes from somewhere. So how does that fit in with this story? I mean, especially if you think the elites are the ones who have the power, how do we go against their purported self-interest?

0:47:45.8 DA: Well, first of all, we need to understand power, and it's a really difficult problem. So I haven't read all of the critical reviews of my book, but I could guess that there are critics from all sides on the issue of power. Because there is a school of thought, again very much in line with Marxist thinking, but not just confined to Marxists, which is that power is rooted in coercive power. Your power comes from your ability to have troops and guns and kill me. And there are pure economists, although I would say misguided economists, who think power should never be any part of an economics discussion.

0:48:41.8 SC: Really?

0:48:42.4 DA: Yes, which is weird.

0:48:44.8 SC: That's weird.

0:48:44.9 DA: Abba Lerner, the famous... Sort of a famous economist of the mid-20th century, wrote that economics became the queen of the social sciences by focusing on politically solved problems. What he meant by that is essentially: don't worry about power, because once you ignore issues of power, then we can think about markets, prices, etcetera, and there we have a lot more to say. But he was essentially pointing out that many problems aren't politically solved. I would go further. I would say there are parts of economics, unfortunately, that led us astray by taking problems in which power matters a lot and pretending as if power didn't matter. So, for instance, thinking of the power of trade unions and workers as part of shared prosperity is, I think, very, very important, from the examples I provided, and I could provide more. But some economists, not many today, certainly not everybody, would say, oh, leave all those power issues aside and let's just think about technology and markets.

0:49:55.5 DA: And I'm trying to say, no, that's not enough. So that's why you have to be somewhere in between, where power matters, but it's not just military power. So it's that soft power. Where does that soft power come from? Well, I would say nobody understands that, unfortunately, even though I think there will be a lot of reasonable people who would say, yeah, of course, it's not just military power. We don't have good theories of where that non-military power comes from. We could call that ideological power. You could call that soft power. In the book, we call that the power to persuade. And I think if you want to understand the power to persuade, you need to think about social psychology. What are the kinds of ideas, and what are the presentations of ideas, that would make certain ideas more palatable, more acceptable?

0:50:48.6 DA: You need to think about social networks. It is social networks that elevate and propagate ideas. You need to also think about who's at the table. The more we hear an idea, the more prestige an idea has, the more appealing it becomes. So the power to persuade really interacts with institutions. If today we had trade union leaders on TV every day, we would think about workers' conditions and things like that differently than when we see tech leaders on TV every day. So I think those are the issues. I would certainly not say that we have any comprehensive answer to these questions, but we do try to highlight the importance of the power to persuade. And I think it's even more important today, because in the 19th century, it is reasonable to think that tanks and guns really mattered for many things. Although, if you look at the details, many important things happened without tanks and guns as well. But today, the tech industry's power has nothing to do with tanks and guns. It's really about their power to inspire, the power to persuade, the power to get things done. And then somehow, as a fait accompli, the rest of society accepts that. So we need to understand those dynamics a little bit.

0:52:14.6 SC: Is there a kind of power in the fact that when I am faced with a 20-page agreement for some software that I sign up for, I just go to the bottom and click accept, and now I've agreed to give away a whole bunch of my life?

0:52:31.9 DA: I would say that is a very important part. That's a great question. I don't know the answer to that. I would say that's an implication of how we have been persuaded. We have given our trust and we've said, I don't matter as a single individual, and hence whatever is most convenient for me, that's what I'm gonna do. It's also very much about media. We continuously talk about tech leaders in the same way that we talk about movie stars; that elevates them and increases their soft power.

0:53:09.0 SC: And not enough professors are thought of that way. It's very frustrating.

0:53:14.7 DA: We're not that charismatic, unfortunately. Yes, Sean. I know.

0:53:19.1 SC: Okay. But then taking these sorts of ideas that get into people's minds and persuade people, and making them into policy, involves institutions. This is a word that you... One of your favorite words, I think. And you have this very...

0:53:34.6 DA: I probably overuse it. Yes.

0:53:36.3 SC: Yeah. Love it. But you have this very nice distinction between extractive institutions and inclusive institutions. Maybe explain that a little bit.

0:53:44.0 DA: Yeah. I mean, I think in the same way that I tried to argue a second ago about technology and other things, it's the choices that we make that matter. It comes down to everything: it's the choices that we make as a society that matter. Who gets political say? Who gets political power? Who gets the economic resources? How do we allocate the capabilities of how to use those resources, and the limits on what you can do to other people? Those are mostly about the institutions that we build, the formal and the informal rules. And when you look at history, everything is complex, but you see these two big different types of institutional ensembles that are quite visible. One is just like the divine right of kings that I was mentioning: institutions that are designed, or have evolved, in such a way that they enable a small group to extract as much as they can. So the Stuart monarchs' institutions, both economic and political, were there to empower them. They were politically empowered. The king had as much control as possible, and that's why many people think they were trying to become as absolutist as the French version at the time. And they had a lot of economic resources and access to economic resources.

0:55:25.5 DA: That is an example of extractive economic institutions and extractive political institutions. Those enabled them, with little constraint, to extract things from the rest of society. You can extract labor, you can extract assets, they extracted financial resources, you can extract minerals. But it's the same principle with these extractive institutions. Whereas, what we argue, the defining feature of inclusive economic institutions is that they include more people by empowering them, for example, having more secure property rights so that a king or a monarch or a government cannot extract their assets or their labor. It also provides the means to become part of the economy. An inclusive institutional system is not possible unless you have, for example, opportunities for people to become whatever type of worker they want to become. If you go back to the South African example, before the fall of apartheid you had the color bar. The color bar said if you are a black South African, you cannot do pretty much anything other than being an unskilled laborer in mines, in agriculture or in factories. The color bar explicitly prohibited black South Africans from becoming, for example, bricklayers or plumbers or supervisors. So that is another form of extraction, meaning that you actually are forcing people into the lowest-paid...

0:57:02.5 DA: Lowest-skill occupation. So an inclusive economy also enables you to pursue your skills; it creates the ecosystem for that. But then we argue that an inclusive economic system and inclusive economic institutions are very difficult to maintain if you don't have inclusive political institutions. Meaning, in the same way that you distribute economic capabilities broadly, you try to distribute political power broadly. So those are the inclusive political institutions. And I would say, there, we didn't get into these issues of vision and soft power, but if soft power is very much in the hands of a very small group, that's going to start bearing down on any inclusive system that you have. And I think that's the danger in the United States, where the history of institutions is complex, due to slavery and a variety of other things. But by and large the US has been democratic, the US has created opportunities for people. But once soft power gets very concentrated, that's a big tension for inclusive economic and political institutions.

0:58:14.5 SC: What set of things should the audience have in mind when we use the word institutions? I mean, obviously government, corporations, maybe unions, but do schools count? Does the media count?

0:58:24.8 DA: Well, I think, yeah, certainly the political system counts as part of the political institutions. And then there are political norms, which are very adjacent to institutions, but depending on the context, you may or may not want to separate them. So you would, for example, say that in the US political norms have changed. It would've been much less acceptable for a president or a presidential candidate to tell lies, and it's more acceptable today. So that's a political norm, but it really affects how political institutions function. In terms of economic institutions: laws. What is it that you can write contracts on? How do we negotiate wages? How secure are your property rights? If the government can come and say, oh, I like your factory, I'll take it, that's gonna be a very different type of economic institution than when private property is enshrined, for example.

0:59:21.4 DA: But also, as I've pointed out, your ability to realize your opportunities matter as well. So that's right. So the educational system is, again, very adjacent to political institutions, sorry, to the economic institutions. And when it is about media, media is a very integral part of how the political system works. Again, depending on the context, you may say, I'm gonna define institutions in a narrow way so that then institutions interact with media companies. But the media ecosystem definitely influences how institutions, political institutions function.

1:00:06.5 SC: And when you give all these historical examples, I'm sure that everybody listening can't help but instantly recognize the parallels with the present day. And you've mentioned some of them already. AI is the first thing that comes to mind, but the whole internet, all of technology, all of our interconnected universe. So, given all the lessons that we've learned, how explicit can you be about your suggested policy implementations? Like, if we think that there is a sort of natural thing that could happen, which, by the way, I had a solo podcast recently where I speculated that the first impact of these modern technologies will be to make extraction more efficient, [laughter] and concentrate wealth. So if we know that's coming, and we've seen in the past that it happened, but then we sort of equilibrated to something better after that, can we anticipate it and prevent it from being so bad?

1:01:03.6 DA: Well, first of all, you're already right, Sean [laughter], because although wealth inequality is difficult to estimate, because there are different forms of wealth that are not always on tax forms, economists who work on estimating wealth inequality, I'm not one of them, think that it's much, much higher than it's ever been, at least in the 20th century. When Elon Musk gets a $56 billion pay package? [laughter] That's not surprising. The way I would put it is that if we start by recognizing the following, that already changes everything: A, it is socially desirable, and B, it's technically feasible to have a different direction of AI. Socially desirable, meaning that relative to where we are going right now, if we can use AI for helping workers, helping information, helping the disadvantaged, that's much better.

1:02:21.4 DA: And B, it's actually technically feasible. It's not a crazy dream to say we can make workers more productive using AI; I gave examples. Or we can make democracy work better rather than worse. [laughter] Well, again, we have examples. Taiwan provides a really inspiring illustration of how you use AI to build new democratic platforms, better ways of regulating and keeping government officials accountable, getting people into new public squares. I think at a small scale, we see all of these things in Taiwan, so it's not a completely crazy dream. Once we recognize that, then I think it opens up the discussion, so that every time you hear a tech leader saying, oh, you have to give another $7 trillion to me because that's the only way we're gonna build a better future, you can say, no, it's not a better future. Because the current direction has got a lot of problems in terms of our democracy, in terms of inequality, in terms of who is in power and who is disempowered.

1:03:26.5 DA: And the alternative is feasible. That opens up the discussion. And once we have that discussion opened up, I think we can think about, well, if we're going in the wrong direction, how can we try to steer technological change in a more beneficial direction? And I think that's where institutions really matter. You and me saying that I think is great. [laughter] You've been very nice. But that's not enough. You really need countervailing powers. Where did the countervailing powers against car companies come from in the past? Well, they came from trade unions: car companies couldn't do whatever they wanted in terms of wages and working conditions. And they came from regulators. Car companies weren't so enthusiastic about putting in better brakes, seat belts, better tires for safety reasons; those came from regulators and from democratic pressure. So those are the poles of countervailing powers against any type of concentrated interest. So we need civil society, we need the democratic process to work better, and we need workers' organizations to be more at the table. And that's actually very important for another reason, in my opinion. Imagine, go with me, that it is possible to make technology that's more pro-worker.

1:05:00.5 DA: Let's suppose we agree on that. And we say, all right, now we're gonna instruct tech leaders to go into their basement, or into their nice office, wherever it is that they're gonna do it, and come up with technologies that are gonna help workers. But they're gonna do that in isolation, without talking to workers ever. That's not a very likely scenario. First, they're not really gonna understand what the workers are doing. A lot of that implicit, tacit knowledge is with the workers. They know what works, what doesn't work, what particular problems they're having, where they need more information. So actually you need worker voice, both for a more equitable division of the gains, but also to actually know what to do with our technological capabilities. So I think this really opens up a lot of possibilities that are just completely sidelined at the moment.

1:05:52.2 DA: And then we also need some specific policies. If we are not happy with how we are using social media at the moment, for example, well, we can think about why we are using social media that way. Why is it that the data of hundreds of millions of people in the United States, billions of people worldwide, is being taken and monetized in this way that creates all of this emotional outrage, all of this extremism, all of these mental health problems? Well, think of the business model that we've come up with. We've come up with a business model where we sell you digital advertising, and that requires that people are glued to their screens. And the best way of doing that is to play with people's social psychology. So we should discourage that. Wikipedia doesn't do that; Wikipedia doesn't do emotional exploitation.

1:06:45.1 DA: Well, perhaps there are ways of making companies more like Wikipedia, or even Netflix, rather than Twitter and Facebook. So digital ad taxes would do that: if we tax digital ads, that would open up ways for new business models to emerge, and we could see whether they can compete with better offerings. So, for instance, I mentioned the Taiwanese model. I don't think any of those technologies could survive today if you threw them into the market system in the United States, because they're gonna have to compete against Facebook, which has this amazing way of monetizing people's data; it's already collected the data. So you need to create a level playing field, in the same way that you had to go against Standard Oil to create a level playing field during the progressive era. So I think digital ad taxes as well as antitrust are parts of that.

1:07:37.5 SC: I'm not sure I can agree that Wikipedia does not emotionally manipulate me 'cause I start reading about mitochondria and two hours later I'm reading about the Dead Sea Scrolls. And it might be emergent rather than intentional, but I feel very manipulated there.

1:07:52.5 DA: I like that. I like that. Okay. I will make an exception for Sean.

1:07:57.4 SC: But before it leaves my mind, are we thinking more about shaping what technologies come to be or just letting the advance of science and technology do that and then shaping how they're used?

1:08:10.6 DA: That's a very, very important question. Thanks for asking that, Sean. Because I completely agree with the viewpoint that science is not something you can control from the top. Science is an evolutionary process. New ideas need to come, they need to compete against others. It's gotta be a decentralized process. Whenever you try to centralize science, that's not gonna work. So I am not claiming that we should have a Bureau of Scientific Research that says everybody should work on this problem. That being said, there are so many things we can work on; at any point in time, what we focus on is a social decision. And it may well be that that social decision, shaped by scientists' career concerns, monetary incentives, business models, and so on and so forth, may not be the right one. Let me try to illustrate that with the energy sector. If you were an energy researcher in the 1990s, the most obvious thing for you to work on would be making the internal combustion engine better, or making coal extraction cheaper, or making oil furnaces work better. Why? Because those were the only technologies around; they were the only game in town.

1:09:44.3 DA: Today, we actually have renewable technologies, solar and wind, that are cost-competitive with fossil fuels in electricity generation. How did we get there? We got there not just because the market process suddenly decided by itself to allocate more resources; it's because of policy. First, there were a bunch of regulations in a number of countries, including in California in the United States. Germany passed the first legislation that required more investment in solar panels. There were innovation subsidies for people working on renewables. And then in a number of countries there were carbon taxes. They were very small, but even that little bit led to a complete redirection of energy research. Look at renewable patents: around the 2000s, they start skyrocketing. And Chinese companies were the first ones to see the opportunities created by the German legislation. So they started building bigger factories for producing solar panels, and then they started exploiting learning by doing, so they got better and better at cutting thinner and thinner wafers and making better solar panels.

1:10:57.8 DA: So it was that process that pushed us onto a path in which renewable technologies started getting better and becoming cost-competitive. The cost of generating energy with solar panels dropped more than tenfold in about 15 years. Now, that does not mean that the government should be in the business of deciding how you design your solar panels, but it can try to steer things a little bit. And if you look at big successes in the past, the government has had some agenda-setting role: the Manhattan Project, aerospace, antibiotics. Those would not have happened without government involvement. And again, the important thing is you have to respect the decentralized nature of science and have the right sort of hands-off approach, but try to steer it in a more socially beneficial direction. And we need more of that in the energy sector. If we can repeat what we've done for electricity generation in steel factories, or for making synthetic fertilizers or aluminum, that's gonna be a complete game changer. So I am in favor of more innovation subsidies for clean technologies, and experimenting with different things, but again, while respecting the decentralized, evolutionary and creative nature of science.

1:12:14.9 SC: So it's kind of like tax policy. You don't want to tell people what to spend their money on, but you wanna, like, maybe put a little more taxation on the things we think are bad. [laughter]

1:12:23.3 DA: Exactly. 100%

1:12:24.9 SC: A little shaping of the constraint surface there. Okay.

1:12:28.9 DA: Exactly.

1:12:29.0 SC: Good. So maybe the last big idea that I wanted to ask you about, 'cause we focused a lot on your recent book, Progress... Oops, now I'm forgetting the name of the book: Power and Progress. But you famously had a previous book called Why Nations Fail, which is a provocative title. I'm betting that even though it sounds like a different topic, in fact there are a lot of relationships between the ideas here. Could you tease those out a little bit?

1:12:56.5 DA: Yeah, absolutely. And in fact, you already very nicely talked about that a little bit; the extractive and inclusive institutions were from Why Nations Fail. And that was essentially focused much more on understanding why it is that we live in such an unequal world. The field of economics is shaped by Adam Smith's Wealth of Nations, an amazing book. Adam Smith was a really amazing thinker, economist, moral philosopher. When he was writing, the gap between the richest and the poorest nation was probably about fourfold or something. [laughter] Today that's like 60-fold.

1:13:47.4 SC: Yeah.

1:13:48.1 DA: So the world has become much more unequal in many dimensions. And certain countries have completely failed in taking advantage of knowledge, technology, opportunities, whatever they are. And we argue that's about these extractive and inclusive institutions, and why this is also very much entangled with power issues. The subtitle of Why Nations Fail is The Origins of Power, Prosperity, and Poverty. So we argue, that's actually with my other frequent collaborator, James Robinson, that we cannot understand the issue of national failure without thinking about politics, economics, and power. And I think, again, there is a lot of overlap between that issue and what's going on in the US today, both in terms of problems of our institutions. Are they becoming more extractive? Is democracy functioning less well? Are we destroying institutional guardrails that we have built slowly over time? And it's also very much related to who controls soft power, who controls economic resources, who sets the agenda, and how the gains are distributed. All of these were themes we started talking about in Why Nations Fail. But of course, that book was written and completed in 2010, published in 2012. So the world has changed. Yeah.

[laughter]

1:15:15.4 DA: My thinking has changed to some degree. The beginning of the book, I remember, we completed the book, we were almost ready to send it to the publishers, we were just proofreading it, and the Arab Spring happened. And so we wrote the preface on the Arab Spring, where we were cautiously optimistic: things are changing even in the Middle East. Well, of course, today the Middle East doesn't look so much more hopeful, unfortunately. And we discussed the US, and we were pointing out some problems in the US, but we were broadly optimistic that US institutions and institutional flexibility would be able to overcome some of these challenges. Well, today, I would be less optimistic about that. And the tech sector's dominance was not as easy to see in 2010. So all of these tech-related issues are not in the book. And of course, they couldn't be; all of what tech implies is such a broad topic that it's not just one book, you need several big efforts to understand it.

1:16:34.8 SC: So the word power just keeps appearing in the titles of your books. But the good news is that that includes the power that you and I and our listeners have to persuade our fellow citizens to try to construct better institutions. So that's a suitably optimistic note to end on, I think.

1:16:50.6 DA: Well, that's a wonderful, wonderful note to end on, but I'll say it's much more your power than mine. [laughter] You're doing a great job with this podcast, and I'm really happy that you enabled me to talk about these issues. But hopefully, you have more power than I do.

[laughter]

1:17:08.9 SC: Well, we'll see about that. But I do appreciate that very much. Daron Acemoglu, thanks very much for being on the Mindscape podcast.

1:17:16.0 DA: Thank you. Thank you, Sean. It was a great pleasure.

[music]
