Within every person's mind there is an ongoing battle between reason and emotion. It's not always a battle, of course; very often the two can work together. But at other times, our emotions push us toward actions that our reason would counsel against. Paul Bloom is a well-known psychologist and author who wrote the provocatively titled book Against Empathy: The Case for Rational Compassion, and is currently writing a book about the nature of cruelty. While I sympathize with parts of his anti-empathy stance, I try to stick up for the importance of empathy in the right circumstances. We have a great discussion about the relationship between reason and emotion.
Support Mindscape on Patreon or PayPal.
Paul Bloom received his Ph.D. in cognitive psychology from MIT. He is currently the Ragen Professor of Psychology and Cognitive Science at Yale University. His research ranges over a variety of topics in moral psychology and childhood development. He is the author of several books and the recipient of numerous prizes, including the $1 million Klaus J. Jacobs Research Prize in 2017.
0:00:00 Sean Carroll: Hello everyone and welcome to the Mindscape Podcast. I'm your host, Sean Carroll. And a few months ago, I went on Twitter and took a brave contrarian stance that empathy is a good thing. Many people, of course, would agree with this automatically, but many other people have read a book by today's guest, Yale Psychology Professor Paul Bloom, which is entitled "Against Empathy", which I admit is a completely awesome book title. And Paul and I actually discussed this on Twitter. He's an extremely reasonable, thoughtful guy, and we do in fact disagree. His point is that empathy, the ability to think about things from someone else's point of view, to put yourself in their shoes, sounds good and maybe it makes us nice people on a personal level, but it gets in the way of being rational, moral thinkers.
0:00:48 SC: We tend to empathize with people close by, with people like ourselves, rather than being purely rational about how to be the best people, how to live in the world correctly. I, on the other hand, tend to emphasize the fact that people who think that they're being rational will often take things into account that make perfect sense from their point of view while perhaps not paying as much attention to things that are front and center to people who are living very different lives than them. To me, if you're really going to be rational, it is absolutely crucial that you are empathetic, that you try to understand what other people who have very different experiences than you have been going through.
0:01:28 SC: So Paul and I talk about this on the podcast. I don't think we are necessarily coming to any agreement. But on the other hand, I don't think we disagree that much either. We're trying to emphasize different aspects of a problem. And Paul is an extremely interesting guy. I think you'll get a lot out of this conversation, and hopefully, we'll all be more moral and good to each other at the end of it. So, as always, please check out the Mindscape webpage at preposterousuniverse.com/podcast where you'll find links to Patreon and PayPal and keep the podcast going. Also, you'll find transcripts so you get to see all the words, and you can even search through all the archives to see what cool stuff, previous Mindscape guests have said. And with that, let's go.
[music]
0:02:25 SC: Paul Bloom, welcome to the Mindscape Podcast.
0:02:28 Paul Bloom: Thanks for having me here.
0:02:29 SC: I wanna actually thank you especially because a few months ago I was having dinner with our mutual friend Carl Zimmer...
0:02:34 PB: Oh yes.
0:02:35 SC: Who was a previous Mindscape guest, and Carl mentioned that every time he runs into you, you say like, "Boy, I was just on another podcast. There's just so many of these things." [laughter] So I presume that you're quite in demand in this medium.
0:02:49 PB: Well, yeah, but there's a couple that stand out and yours is one of them. So...
0:02:53 SC: That's good to hear. Good.
0:02:55 PB: I know of you through Carl and through your popular books, and so I'm just totally thrilled to be here, and you and I have had a run-in on Twitter, if I remember.
0:03:03 SC: Yeah, if you wanna call that a run-in. Yeah. I mean I've had run-ins. That didn't seem like a run-in from my perspective. It was a nice little chat.
0:03:09 PB: A nice little chat, but that sort of sets the stage for us to talk in person.
0:03:13 SC: Yeah, that's right. So there's lots of things to talk about obviously, I mean psychology is a pretty broad area and you've worked on, work in a lot of areas. Let me float a hypothesis and you can either shoot it down or sharpen it up.
0:03:28 PB: Okay.
0:03:29 SC: A lot of your work seems to deal with what we might think of as the boundary between reason and emotion or almost the battle lines between those two things. Is that a fair characterization?
0:03:40 PB: It really is. I mean particularly in my recent work, I've always been interested in how much of our natures are based on emotion and feelings and gut reactions, and to what extent rational deliberation could play a role. So this crosscuts across a lot of my research.
0:03:57 SC: Yeah. So what's the answer? [chuckle] How rational can human beings be? I think this is actually a more involved topic than just a simple yes or no answer.
0:04:06 PB: Yeah, but I'll go for it. I think we're a lot more rational than most of my colleagues think we are. I don't doubt that a lot of our feelings, our judgments, particularly our moral judgments, are influenced by our emotions. We're often very irrational indeed, certainly politically. I think politics brings out the worst in us. But I also think we have a tremendous capacity for rational deliberation, and in fact, you see this at its best in sort of your day job in science, in actual science, which shows that humans can, under optimal circumstances, be rational and deliberative and come to real insights about the world. And I would make the stronger claim that we could do this when it comes to morality, when it comes to politics and in our everyday lives.
0:04:51 SC: Good. So this is a perfect starting point for the conversation because as I have grown older, and whether or not wiser, my faith in human rationality has only diminished over time. So I'm very happy if you are gonna be able to convince me otherwise. Why don't we start just, you know, it's a broad audience from a lot of backgrounds. Most people probably think of themselves as pretty rational. What are the reasons that we've had from psychology and elsewhere to start to doubt that we're perfectly rational beings?
0:05:24 PB: It's a good way to begin. There's a lot of sort of striking demonstrations from psychology that we aren't as smart as we think we are, and I think the foundations of this work come from the psychologists Danny Kahneman and Amos Tversky.
0:05:39 SC: Right.
0:05:40 PB: And Kahneman eventually won the Nobel Prize for his work in economics, and what those two did was put together a really impressive body of work showing that we're highly susceptible to cognitive illusions and cognitive biases. We are biased. We make mistakes. We depart from logical norms, mathematical norms in such ways that we could often be tricked and people could exploit us.
0:06:09 SC: Right.
0:06:09 PB: And so there's a lot of those demonstrations, which tend to be sort of more mathematical, logicy demonstrations, but some of them are quite striking. For instance, one typical one is we radically overestimate the odds, the likelihood, of infrequent yet very salient events, like terrorist attacks or being bitten by a shark, and we radically underestimate the relative likelihood of pretty frequent events like heart conditions and dying of a coronary. So that's one tradition. And then the second tradition is more recent, and this is social psychologists like Jonathan Haidt who argue that when it comes to our moral reasoning and political reasoning, we might think that we have arguments. You might think your views on abortion and the death penalty and Donald Trump are because you're a deliberating being, but there's a lot of demonstrations suggesting that actually your views on these things were determined by other factors, and the reasons you give are after-the-fact justifications. So that's why a lot of my colleagues are skeptical about the power of reason.
0:07:09 SC: Yeah, I think that fits in with the very first episode we had of Mindscape, where I talked to Carol Tavris, who's a social psychologist who has done some work on cognitive dissonance, and she wrote a wonderful book with Elliot Aronson called "Mistakes Were Made (But Not By Me)".
0:07:23 PB: Yeah.
0:07:23 SC: All about how we justify the decisions we make, and how the justification of a decision goes forward whether or not that original decision was rational.
0:07:35 PB: Yes, that's right. That's right. And there's all these sort of laboratory demonstrations, everyday demonstrations, suggesting that in some way, we don't live our life as sort of comic book scientists running experiments in the world and coming up with theories. Rather, we're kind of like, to use Jonathan Haidt's term, we're like lawyers. We have to explain something. We have to make a case for something after it happened.
0:08:00 SC: That's right. We have to justify it. And in fact, in the science fiction realm, we caricature Mr. Spock and Commander Data as the examples of people without emotions, perfectly rational, and they're always the ones learning from us human beings 'cause we're wise and we have our gut instincts and our emotions are very helpful.
0:08:18 PB: Yeah. I mean in science fiction, I think even good science fiction, really bungles emotions.
[chuckle]
0:08:24 PB: So I know... So part of... You capture one part of it, which is that somehow Data and Spock are fatally flawed in some way, in that they needed the touch, the spark of emotion, that only a Captain Kirk has...
0:08:37 SC: Yeah.
0:08:38 PB: That humans have. On the other hand, I think that they terribly miss out on the importance of emotions as motivators. If Spock and Data really had no emotions, why would they get out of bed? Why would they do things? Why would they... Why wouldn't they just sit in a lump in a fetal position and think thoughts?
0:08:57 SC: Well, as David Hume said, "Reason is the slave of the passions," right? We need some emotion to give us goals, and then reason should tell us how to achieve those goals, roughly speaking, right?
0:09:07 PB: That's right. And even people like me who wanna push the importance of reason would argue that you need a kick in the pants of emotion.
0:09:18 SC: Right.
0:09:19 PB: I've argued that we're often much better when it comes to morality when we try to think our way through and not follow our gut. But even I would admit that if we didn't have gut motivations to care about other people, you could come to all the wise moral decisions you want. You wouldn't do anything.
[chuckle]
0:09:35 SC: Well, good. But I think that... So that is an important fact about the role of rationality: when we invoke it, it's in the cause of some goal which arguably is given to us by emotions. But what you're saying about rationality is something a little bit different, I guess, which is you're saying that despite these social science studies showing all the different ways in which human beings justify ourselves and make logical mistakes, you're optimistic about the extent to which we're reasonable.
0:10:05 PB: That's right, and I'm optimistic for all sorts of reasons. One reason is we have certain domains where rationality has proven to be very powerful and very useful, science and technology being one example. If we really were such stupid creatures and so unreflective and so biased, how did we come over time to have such a rich understanding of the universe, of domains that we were not evolutionarily prepared to know about? Yet somehow we managed to figure it out. I think, more controversially, morality is another domain. We're a lot better people than we were 1000 years ago. And part of it is, I think, the exercise of reason over time.
0:10:43 SC: Yeah. So good. Let's dig into this a little bit because I like doing science first. I think that science is the perfect example that psychologists and philosophers should pay attention to, because if we can't figure out how science works, then it's hopeless; science is the most straightforward thing that we have. It's much easier than morality. And, as you say, the power of reason has taken us pretty far in science and understanding the world, but then a lot of people on the street are not very good at science. We human beings need to invent all sorts of tricks and double-blind studies to prevent ourselves from giving in to our biases. Is there any way of quantifying whether or not we're overall rational or irrational?
0:11:29 PB: Yeah, in some way I push back a little bit. Science is, in some way, a very strange domain. I mean, for the reasons you gave, it's very unnatural. It's this bizarre thing maybe we stumbled onto, and it has turned out to be of extraordinary use. And I think science illustrates something which I think brings together your observation about the man on the street with my pushing of reasoning, which is that reason will flourish in social institutions where it's valued. And often we're sensitive enough to our own weaknesses that we can intelligently create institutions and mechanisms and procedures, like refutation and experiment and blind review and so on, that can lead us to do much better. And I think we see this in other domains as well. So if I'm aware of my racial biases, I could choose to use a procedure like blind review or a quota system or something to override them. If I'm aware that I love my kids more than I love other people and would favor them, I could support a system which bans nepotistic hiring. And so this is sort of a very human story, where we have these weaknesses, but we're also smart enough to think our way through and override them. And science is this great example of how people manage to do that.
0:12:57 SC: Yeah, no, I think... I mean, part of it, maybe, is not that there's any disagreement or even difference going on here, but just a choice to emphasize one thing or another, right? Like in some sense we are burdened with some level of irrationality, but we are blessed with enough rationality that we can overcome it when it's really helpful.
0:13:17 PB: That's right, and where you draw the line and the emphasis you put on things matters a lot. So a lot of my colleagues emphasize the irrationality of our political behavior, our voting behavior. My temptation is to be more skeptical about these claims of irrationality and argue, for instance, that a lot of people's political behavior seems irrational because psychologists miss out on the fact that people aren't actually aspiring towards the truth. If I really support the Red Sox and I cheer them on and you were to say, "Well, that's not rational, they're not favored to win," [chuckle] then you're kind of missing the point. My cheering them on is not this objective assessment where being right matters. My cheering them on might have to do with the community I'm with and who my friends are, and who I wanna support.
0:14:06 SC: And that seems to be very compatible with what I've read about political allegiances that a lot of times even when we're asked factual questions we give answers, because in our mind, subconsciously or consciously, those are the right answers to give given our political side.
0:14:22 PB: That's right, so you take somebody who has a view, left or right, say about climate change. And you say, "Oh, well, isn't it crazy? This person doesn't know what they're talking about, and they just have the view that other people have," but your average person doesn't have much stake in being right about climate change or any one of a number of political issues. They have very little say in what happens; whether I have the right view or the wrong view will have no influence on my life or anyone else's life, but it will matter a lot for my friendships and my relationships if I hold a certain view. And so people see... I think people think of politics, maybe not unreasonably, as akin to sports. And so they cheer on their team, they boo the other team. For a while... I'll take one example. Years ago there was a lot of polling that found a lot of Republicans said they believed that Obama was born in Kenya. Turns out there was actually polling where they asked people where was Trump born? And a lot of Democrats say Trump was born outside the United States.
[chuckle]
0:15:25 SC: I did not know that one, okay.
0:15:27 PB: And I think what they were saying is, they were saying, "I haven't studied these issues, I'm saying boo Obama, boo Trump."
0:15:33 SC: Yeah. Right. And there's even studies where if you offer them a nickel for getting the right answers, suddenly their answers change, right.
0:15:39 PB: Exactly, when you shift the incentive structure and make it that it's actually for once important to be right, then people start behaving differently. And I actually think when it comes to local politics, people are much smarter. They might vote their self-interest, but they're actually genuinely curious and engaged in the facts because they matter. Whether or not you put a stop sign on the corner, or you put in sidewalks, or you pay more for snow removal, well, that's the sort of thing where it actually does matter to my life and my views do matter, and then I become more attuned to the truth.
0:16:12 SC: But let me see... I've read, I think, most of Kahneman's book, "Thinking, Fast and Slow", which is one of those wonderful books that once you read it, it changes the way you think about the world.
0:16:21 PB: It's brilliant.
0:16:22 SC: And the impression I get from it, not being a trained psychologist, is that a lot of us have a vision of a human being, or at least ourselves, as, like it or not, a little homunculus inside our brain pushing around the levers to make us do things, and that's not a very good picture, 'cause what's inside the homunculus's brain?
0:16:42 PB: That's right.
0:16:43 SC: But the more psychologically accurate picture is that there's lots of competing sub-processes going on, and there's lots of heuristics and shortcuts and common-sensical rules of thumb that we use, rather than using the full force of our reason to get a right answer, and then give it in a single unified consciousness.
0:17:03 PB: I think that's very much right. Though Kahneman himself has long defended a view that says that there are two general systems, and each system is probably itself composed of many systems. So one system is what you're talking about now, it's a series of gut instincts and emotional responses, heuristics and biases. But the second system is a deliberator. It's a slow, careful reasoner. And in much of life we use the quick emotional system. In fractions of a second, we make judgments, we let our biases go, but we can slow it down when given a puzzle: a bat and a ball cost together $1.10. The bat costs 10 cents more than the ball. How much does the ball cost? Have you done that one?
0:18:00 SC: No, but I could do it. I've read the book so I know how to slow down and get it right.
[chuckle]
0:18:05 PB: And I realized under this pressure, I probably mangled the example, but there's all sorts of things like that where there's an immediate answer, and then there's a slow and reflective answer.
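[Editor's note: the standard version of the bat-and-ball puzzle, which Paul notes above he may have mangled, says the bat costs $1.00 more than the ball. A minimal worked solution under that standard phrasing: if b is the price of the ball, then b + (b + 1.00) = 1.10, so 2b = 0.10 and b = 0.05. The ball costs 5 cents; the immediate System 1 answer of 10 cents would make the total $1.20.]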
0:18:14 SC: That's right.
0:18:15 PB: And to get back to what you were saying before, people do differ. There are people who are very reflective in their lives and people who tend to go more for their gut, but which one you are depends in part on your nature, but it also depends on the environment you're in and your upbringing, and so on. But this illustrates that, in addition to the sort of hodge-podge of heuristics and biases that you're talking about, there's also something else in our heads that's closer to an ideal reasoner, at least in principle.
0:18:45 SC: Well, right. So in this picture, Kahneman, I think if I'm getting the numbers right, has system one, which is all the heuristics and biases, and system two, which is the deliberator, the cognitive part. Is what you're trying to say that we're more system two than we give ourselves credit for, or that system two is more powerful when we choose to use it?
0:19:07 PB: Both.
0:19:08 SC: Okay.
0:19:09 PB: Both. And I think system two is a really important part of the story. Psychologists like to say, "Look, people make all sorts of dumb mistakes," and we talk all about these dumb mistakes.
0:19:23 SC: Yeah.
0:19:24 PB: But what psychologists... So one example I saw is that, I think it was Burger King, in response to the quarter pounder, they had a third pounder.
0:19:35 SC: Yes, I've seen this one.
0:19:36 PB: And then the idea was, nobody bought it, because they figured a third pounder must have less meat than the quarter pounder 'cause three is less than four.
0:19:43 SC: Yes.
0:19:45 PB: So, people, we love that stuff. But here's what we're missing. We're missing the fact that I could tell it to you, and we could laugh about it, because there's another part of ourselves that knows that's a mistake.
0:19:56 SC: Right.
0:19:57 PB: And so, we're put in a kind of odd position of almost laughing at people and ourselves for our stupid mistakes and forgetting about the fact that we know they're stupid mistakes.
0:20:06 SC: Yeah.
0:20:07 PB: And this is not... It's not a minor problem. I think psychologists' focus on the irrational has left us largely unprepared to explain things like moral and scientific and social progress.
0:20:20 SC: Right.
0:20:21 PB: Which often work on deliberation, and even if system two, the deliberator, only works 1% of the time, it's a very important 1%.
0:20:31 SC: Sure. And is this pushback that you wanna give, the pro-power-of-reason kind of thing, based on studies or experiments that you're doing, or are you just sort of surveying the existing art there?
0:20:47 PB: So, in my own work I've been most interested in this in the domain of morality, and the basis for my critique... Basically, my descriptive claim is that we're capable of moral reasoning, much more than many of us think. And my normative claim is about how we should live our lives: we really do better, we're better people, morally better, when we rely on our reasoning rather than our gut feelings.
0:21:13 SC: Yeah.
0:21:13 PB: And so in my other work I've sort of argued that emotions like disgust lead us morally astray. More recently, I have argued this about empathy, argued that empathy often leads us to make bad and immoral choices. And the basis for my argument is to some extent philosophical, arguments and examples from philosophers showing the limitations of empathy, and to some extent experimental. There are some very clever experiments by Dan Batson and many others showing, illustrating in very sharp terms, how relying on our feelings leads to terrible and immoral mistakes.
0:21:52 SC: Right. Yeah. So that's... You wrote a book "Against Empathy", one of the great titles in the history of books.
0:21:58 PB: Thank you. I have some ambivalence about my title. It catches people's eye, but it's caused me no end of trouble.
0:22:05 SC: It took me a while to learn the lesson of being an author that the title is... The purpose of the title is not to summarize the book, it's to sell the book...
0:22:13 PB: Yeah.
0:22:13 SC: And to get people to read it. And I think that it worked for your book.
0:22:15 PB: Yes. Although the purpose of my title seems to be to provoke people to send me unhinged emails.
0:22:21 SC: Oh, I'm sorry to hear that. Yeah, that's too bad. But if they do that after buying the book, then their job here is done.
0:22:26 PB: Right. On balance, I don't regret it.
0:22:29 SC: Good. So, and I do think... So this is where we started to disagree on Twitter, and I think we still disagree, but again, I'm gonna be open here to you changing my mind once and for all.
0:22:38 PB: Well, same here.
0:22:40 SC: Tell me... Let's be systematic. How do you define empathy, and why do you think we give it too much credit?
0:22:47 PB: Yeah. I mean, one of the worst things about this topic is I'm forced to start in the most boring of all ways by defining my terms.
0:22:53 SC: I know. [chuckle]
0:22:54 PB: 'Cause people use the term empathy in all sorts of ways, and a lot of people think empathy just means goodness and kindness and morality.
0:23:00 SC: Right.
0:23:01 PB: And if they're using the term that way, I'm not against it, and I'm all in favor of being good and kind and moral. I'm using it in a reasonably narrow sense, but it's how a lot of psychologists and philosophers use it, which is getting in another person's shoes, feeling what they feel, seeing the world through their eyes, feeling their pain. And this seems like a very good thing, and there's all sorts of examples where it could be a good thing. But the problem is, when you come to your moral views and your moral decisions based on zooming in through the perspective of another person, several things happen. For one thing, it's very biased. I'm much more likely to feel empathy for you, a fellow professor, maybe same age, same ethnicity, than I am to feel empathy for somebody who has a different color skin, or a woman, or lives in a far-off land, speaks a different language. So there's that kind of bias. I'm more likely to feel empathy for the attractive than the ugly, for the safe than the scary. Empathy is innumerate.
0:24:00 PB: And so empathy draws us to the fate of the one, but leaves us neutral or ignoring the fate of the 100. And just to kind of wrap it up, because empathy is sort of so biased and can be focused, it's often used as a tool for what I think are unpleasant ends. Last night, our President gave a speech to the nation where he argued for a border wall. Now, putting aside whether the argument is a good one or a bad one (that's not a topic for here), what you'll notice is, Donald Trump uses empathic arguments all the time.
0:24:41 SC: Sure.
0:24:41 PB: He talks about victims. He talks all the time of victims and people who are murdered and people who are raped, and this is a tried and true method to generate animosity towards a group. So, when some people think of empathy, they think of charity; I tend to think of war.
0:24:58 SC: Yeah.
0:24:58 PB: Empathy is just used... It can be weaponized and often is, and certainly is in these times.
0:25:04 SC: Yeah, that actually harkens back to a podcast I did with Yascha Mounk where he talks about the word 'populism', and you might think that populism would be a good thing.
0:25:15 PB: Yeah.
0:25:15 SC: We have a democracy. You should do what the people want. But in practice, populism is all about defining some people as not the people, right? As outside the set of people who really matter.
0:25:26 PB: I think that's right. I think there are all sorts of things that feel good, but if you look at how they're used in everyday life, they have terrible ends.
0:25:36 SC: Yeah, and again this might end up being one of those things where in fact we're in complete agreement, but are just choosing to emphasize different things. But let's play that out and get it on the table. So I think I understand your argument, and in some sense, much of what you say is just obviously true, right? Clearly, we are more empathetic for people who are like ourselves, and therefore that causes trouble. Is it safe to say that what you would like to emphasize instead of empathy is a more strictly rational approach to morality?
0:26:09 PB: Yeah. I think that there are always gonna be... I'm not pretending that I have the solution to moral problems. I don't know whether I'm utilitarian or deontological, there's a whole lot of problems that I'm not pretending to solve. But when we rid ourselves of making decisions based on empathic pull, it leads to changes in our moral judgements that I think everybody would agree are better.
0:26:36 SC: Yeah.
0:26:36 PB: I think there's nobody who would defend the idea that an attractive person's life is worth more than an ugly person's, or that one life is worth more than a hundred. So yeah, my claim would be that the alternative to empathy in that regard is thinking it through, realizing what your moral goals are and trying to figure out what's the best way to achieve them.
0:26:55 SC: Okay, good. So two concerns or two thoughts in my mind. First is, I can't help but think that the problem that you're raising is not that we have empathy but that we don't always do it right, or almost that we don't have enough empathy for the right people, right? Like, if it's true that I have a lot of empathy for a small child drowning in the pond that I'm walking by, and I don't have empathy for starving millions of people in Africa, then wouldn't an alternative strategy just be, well, train yourself to have more empathy for the people who are in Africa?
0:27:34 PB: Yeah, I think that's very reasonable. There are two issues with it. One is, I think there's some problems with empathy which really are intrinsic to empathy. So empathy by its very nature zooms you in on single individuals.
0:27:48 SC: Right.
0:27:49 PB: So empathy is not a mental system that deals with numbers, and to the extent that some moral judgments should deal with numbers, that a thousand lives are worth more than 10, empathy leaves you silent. In fact, it's kind of worse, 'cause empathy prioritizes a single individual you zoom in on at the expense of faceless individuals who you don't. Now, you could imagine if God tried to make decisions through empathy, and God could empathize with a billion creatures at the same time, maybe it would be less of a problem.
0:28:21 SC: Yeah.
0:28:22 PB: But we're limited, and empathy is for us a spotlight, and spotlights shine brightly on spaces, but they have their limits. And then just the second point: it's true that if we could apply empathy in an unbiased fashion, many of the problems I'm talking about wouldn't arise. But we can't. It's sort of like if I said, "Look, you have to decide on a graduate student or a new faculty member to hire. Just choose the one you like the most, but try to make it so your liking isn't biased." Well, if your liking wasn't biased, if your liking was fully on the merits, then liking is a good measure, but that's not how people work. You're gonna like somebody who is charming and is interested in you and comes from a similar place. So a better system is trying to make it so that you don't use liking. You use more objective criteria.
0:29:20 SC: Right. But you know in some sense, we started with this discussion of the relationship between reason and emotions and how emotions sort of provide a starting point for reason to then say, "Okay, here's our program. Let's carry it out." Couldn't there be a similar situation with empathy, where empathy gives us a starting point, and then reason takes over and says, "Well, wait a minute. It's a little bit unfair just to be empathetic with this person. Let's take the thing that we started with and be more rational about it."?
0:29:48 PB: It could. I mean, empathy could be a spark for good actions. You could see somebody suffering, feel their pain and then move to help them, and your helping them could be mindful of other people. It could be respectful. It could have none of the problems I worry about. I don't doubt that that can happen. On the other hand, you could feel empathy for a group of people and it could lead you towards violence towards another group of people.
0:30:12 SC: Yeah. I think that the point that you make, that empathy tends to be singular, that it's easier to have this response with a person with a face and a description and so forth, that's a very good point and there are very few sensible moral systems that say that should be our motivation.
0:30:29 PB: That's right. That's right. And sometimes people say, "Well, are you pushing a Peter Singer-like consequentialism, or is there a certain moral position there?" I have my biases, but to see that empathy leads you astray, I think you could make the case that, no matter where your morality is within a certain reasonable range, you'd say, "Hey, that's not the right way to do it."
0:30:49 SC: Right.
0:30:49 PB: But to go back to your point, you're entirely right, and I would happily concede that empathy could spark you towards good moral actions. So too for anger. Anger is a very interesting emotion. Anger sometimes gets a bad rap and I think we can do better than being motivated by anger. But anger at an injustice could really motivate you to do really good stuff as well. I think in general though, I try to push... The subtitle on my book, which is less in your face than the title, is "The Case for Rational Compassion".
0:31:22 SC: Right.
0:31:22 PB: And I think what people call compassion, a general desire to improve people's lives, I think that's a better motivator than either empathy or anger or disgust or shame or guilt.
0:31:35 SC: Yeah.
0:31:35 PB: But again, I wouldn't... I don't disagree that... I can name all sorts of cases where empathy, as well as anger, disgust, shame or guilt led to good action.
0:31:45 SC: Yeah. No, that's perfectly fair. I don't wanna say it doesn't matter, but there's all sorts of starting points for good things that we can get to.
0:31:52 PB: That's right. I mean suppose... You and I probably don't like racism very much.
0:32:00 SC: Racism is bad. Yeah. That's an official Mindscape platform.
0:32:00 PB: Yeah. That's right. That's right, we'll leave this... We're not gonna edit this one out. So racism's bad but it's a trivial exercise to think of cases where racism could have a good effect. If you think, for instance, if there's a horrible politician and somebody uses racist appeals to keep anybody from electing this horrible politician, and as a result, the world is so much better. Well, racism did some good.
0:32:22 SC: That's right.
0:32:22 PB: But still you'd say, "Yeah, but in general, it's a bad way to do things."
0:32:26 SC: Yeah.
0:32:26 PB: It's not attuned to morally relevant features that you wanna focus on.
0:32:31 SC: Good, this is very clarifying. I think that there certainly is a lot of truth here, and the connotations of the word empathy are just so positive (this is why "Against Empathy" is such a great book title, right) that it deserves a little bit of pushback, just so we can... even if it's only to think of it properly overall.
0:32:50 PB: I think it is, and it goes back to what you started with, which is, I kinda wanna make the case for reason and against emotions. In other work, I tried to argue that look, disgust is a very unreliable way to make moral decisions. But what I found was, everybody agreed with me.
[chuckle]
0:33:06 PB: I mean, a sort of liberal milieu of people who read these kinds of books and they went, "Oh, that makes total sense."
0:33:12 SC: Yeah.
0:33:13 PB: So I wanna say, "What sort of emotion would you think would be good?" And I can make the case that, even for that, you're better off with more rational approaches, so that brought me to empathy.
0:33:24 SC: Right. So let me get on to my second point then, which is actually where we started on Twitter, 'cause I tried to make a... I said that, I forget exactly what I said, but I think the point was that empathy is a crucial part of rationality, in a very real sense. And so this is what I really wanna get your opinion on, 'cause I don't think that we quite got there on Twitter.
0:33:45 PB: Yeah.
0:33:45 SC: Because I am skeptical about human beings' ability to be rational, but as a trained scientist, I know that we can get there by using tricks and strategies to make ourselves be more rational. I think that one of the most important tricks and strategies is the need to try to empathize with people unlike ourselves, and what I see over and over again in people who are, well, self-described as extremely rational, and to be fair often are rational, is that they're choosing to look at certain things going on in the world and just not paying attention to other things going on in the world. And I would argue it's because they are very different from the people that that's happening to. So they just discount these bad things.
0:34:32 PB: Yeah.
0:34:33 SC: I just this morning read an article in the Paris Review by R.O. Kwon about how difficult it is for her to read novels by male authors describing female characters, 'cause they just don't get the fear that women live in walking down the streets at night by themselves. And I think that one of the major tricks that a person trying to be rational should use, if they want to get a theory of morality and justice and how to arrange society, is doing everything they can to empathize with people very unlike themselves.
0:35:07 PB: Yeah, I think there's a lot of truth to that, and that kind of brings us a little bit into different senses of empathy. So, one sense of empathy is a matter of feeling what another person feels, and I actually think, among other things, we're not very good at it. So for me to say, well, to a woman in that position for instance, "Oh, I could really get to understand and feel what it's like to be in your situation," seems arrogant. But I do think, and I think you're right, I think we should make an effort to try to understand the lives of people very different from us. It makes us better people. It makes us understand the world more. It's fascinating.
0:35:48 SC: Sure.
0:35:48 PB: I mean, so much of the best literature and movies and books is the exhilaration of trying to understand a world different from your own.
0:35:56 SC: Yeah, I think... but I wanna go a little bit further than that.
0:36:00 PB: Yeah.
0:36:00 SC: I think that, because as a scientist, as a physicist, I know all of the strategies that we use. When I had Kip Thorne on the podcast, we talked about the fact that the LIGO experiment that detected gravitational waves had a whole protocol by which a little sub-committee would inject fake signals into the data, and then the entire rest of the collaboration would not know whether it was a real signal or fake. So they would analyze the data, write a paper, the whole bit. Because that was the only way to be fair. And I think that, likewise, when we're talking about racism, we all agree racism is bad, but there's lots of people who think that racism is basically over.
0:36:45 PB: Yeah.
0:36:45 SC: That there aren't really bad effects from racism. And I think that those people, whatever conclusions they reach, they have an obligation to really try very, very hard to make sure they are seeing things from the perspective of people who are suffering from racism. And that seems to me to be the definition of empathy.
0:37:02 PB: I would agree with that, but I think it's important to get to causality, right? So, it's not like I'm a guy who just doesn't care about other races, I just care about myself...
0:37:11 SC: Sure.
0:37:12 PB: And I think there's no problem with race and so on. And then if, in this example, I somehow find myself empathizing with somebody from a far-off land or a racial minority in my community, that expands my moral circle. I don't think that's actually how it happens. I don't think we find ourselves almost by accident empathizing with people whose lives are very different from ours. We're often very good at keeping our empathy close, closed in. I think what happens is people come, through rationality and other means, to a broader world view, a more cosmopolitan view, say, and then because of this, they choose to empathize with others and try to understand the lives of others.
0:37:56 SC: Good.
0:37:56 PB: And I... I hate to be too agreeable, [chuckle] but I do think if I'm in a... If I was running a diverse lab with people with different backgrounds, I might have a fundamentalist Christian, I might have someone who has a severe hearing disability, and so on. In order to run it well, both practically and also trying to be a good person, I guess I should try my best to know what it's like to be these other people. I do think, though, that the best way to find out isn't to go through some empathic exercise. It's mostly, usually, to ask them. I feel...
0:38:33 SC: That's always very good advice, I agree.
0:38:37 PB: I think there's somewhat of an arrogance to saying that somebody like you or somebody like me could feel what it's like to be in a very different situation. We try. There's a pleasure in trying; that's what literature, movies and so on are about. But if I really wanna know what it's like to have a severe disability, or come from a very different culture, or be a fundamentalist in a very secular society, I don't think empathy itself, in the sense of sort of gritting my teeth and trying to put myself in a person's shoes, is the way to go. I think what you do is you talk to the person, you treat them as... And you come to the conclusion, and maybe say, "Look, I'm actually not gonna end up knowing what it's like to be a woman in a sexist society. I'm really not, I'm never gonna really feel it, but I could listen to people in that situation, talk to them about what's fair, what's decent, what satisfies principles of justice, morality, and follow that."
0:39:33 PB: So maybe I am actually being disagreeable. [chuckle] I would argue that somebody with no capacity for empathy in any interesting sense, who is willing to listen to people and who honored principles of justice and morality and fairness, could be a great moral person, and could do very well in the world. While someone who really wanted to feel what it's like to be other people, I actually think, is running all sorts of moral and practical risks.
0:40:04 SC: Good, actually, I love that. Even if that's just sort of a final emphasis, I think that really crystallizes what I take to be the heart of what you're saying that I am very happy to agree with. It's interesting, the thought experiment of someone who had no empathy; I could agree that as long as we somehow gave them the right moral starting point, they could be a very moral person. And then I would say, just to be troublesome about it, that since none of us are that person, and we do have empathy, and we do have other emotions, disgust and tribal identification and so forth, that empathy... The combination of rationality and empathy aimed at the right people is a crucial tool in overcoming our first, most primitive instincts. There you go.
0:40:55 PB: Well, that's not bad actually, it's kind of...
[chuckle]
0:41:00 PB: You know, the book's long been out, so I'm willing to concede some points. But let me put it this way, and I'm curious whether you agree or disagree. Just take the example that we're both familiar with: if you're running a large and diverse lab with a lot of people, or a department or something with a lot of people, I actually think high levels of empathy are more likely to be a hindrance than a help. They will do good in exactly the way you're talking about; it'll give the person the possibility of feeling what it's like to be somebody who's very different from them. But as it's typically utilized, the person of high empathy will just feel a lot about the fates of people who standardly elicit empathy. It takes a lot of effort to shift empathy from the usual targets and focus it on people who aren't the usual targets. It's like any sort of bias: there's an underlying bias towards the familiar and the friendly and the people like you, which means that people with high empathy, and people with high feelings in general, will just naturally direct these feelings towards those who are like them. Yeah, they have the capacity to do otherwise, but 90% of the time they'll be zooming in on that person who was just like them when they were in graduate school.
0:42:16 SC: Yeah.
0:42:17 PB: And how could you not understand that person's struggles and difficulty? Well, there's this weird person in the corner with the funny clothes and a funny accent; yeah, I could imagine what it's like to be them, but it's a lot of work.
[chuckle]
0:42:29 SC: Good, yeah, I think at this point we're slicing the baloney of our differences very thinly, and most people get it. I mean, you wanna, correctly I think, point out that giving in to the empathy that we naturally have can lead to just as many deleterious consequences as good ones. I wanna emphasize the fact that trying to have empathy for people unlike ourselves is a useful corrective.
0:42:51 PB: Good.
0:42:52 SC: If people can choose their own balance there.
0:42:54 PB: We could have a Twitter poll.
0:42:56 SC: Yeah. But you're not done, you're not done writing books, you're moving on to other things. I've noticed that you're interested in cruelty these days. I'm very much hoping that the sequel to "Against Empathy" will be "In Favor of Cruelty". That will be a great book title.
[chuckle]
0:43:14 PB: No.
0:43:14 SC: Leaning into the contrarian part of the stuff.
0:43:15 PB: Yeah. I'm actually quite against cruelty.
0:43:20 SC: Oh no. [chuckle]
0:43:21 PB: But, but, but... I will satisfy your appetite for the controversial. I have heterodox views about dehumanization. So...
0:43:34 SC: Right. Which is good, yeah.
0:43:35 PB: So maybe I'll be... Maybe my book could be "In Favor of Dehumanization", or at least "Dehumanization Is Not the Problem You Think It Is". So...
0:43:44 SC: Yeah, it's related to the empathy thing, right? Putting ourselves in the shoes of somebody else, and now you're sort of turning that on its head a little bit. How do we perceive or think about the people who we are not being nice to?
0:44:00 PB: That's right, and so it stems from my empathy work, where a really sensible challenge to me is: look, if you didn't have empathy, you wouldn't really relate to, fully see, the humanity of other people, and when you don't see the humanity of other people, they argue, you dehumanize them and you don't think of them as fully human. And once you do that, that's terrible, that's morally terrible. So if you really know what it's like to be another person, if you really feel what it's like to be them, you treat them as full humans, you have to treat them with respect and love. And I think...
0:44:30 SC: This is a conventional point of view, right?
0:44:32 PB: I think every word of it's wrong.
0:44:34 SC: Good.
0:44:35 PB: Dehumanizing is awful. To treat somebody as less than human is to make a moral and factual mistake. But I've been convinced by a lot of people, really smart thinkers on this like Kwame Anthony Appiah and Kate Manne, and others, that some of the worst things we do to other people are in full recognition of their humanity. So...
0:45:01 SC: Right. Which is a very depressing conclusion, I have to say.
0:45:04 PB: It is a very depressing conclusion. The dehumanization thesis is so cheerful.
0:45:08 SC: Yeah.
0:45:09 PB: It basically says, "Look, you know, all the evil in the world, it's based on a mistake." If you... If people confusedly think the Jews or the blacks or the gays or the women aren't fully human, once they come to their senses and realize that they're real people, then all this cruelty and nasty stuff will go away. It's so cheering. But I think when you look at atrocities and everyday violence and cruelty, often it's motivated by a full appreciation of others' humanity. So there's all sorts of case studies. Kate Manne makes the case regarding gender violence, like domestic abuse of women by men. And she says, look, when this guy hates women, or hates or beats up on his spouse, it's not that he's thinking of them, "Oh, they're non-human, they're just things, they're just objects." If you thought that, why would you wanna make people suffer, why would you be so angry at them? Rather, he's responding in a way one responds to people. People could be a source of delight and love and transcendence, but people could really make you mad. If I wanna... If... Sean, if I wanted to make you suffer, well, the way that would happen is if somehow I felt you were humiliating me, or you had done some horrible moral wrong, or you were this deep threat to me, all very human things. Well, if I thought of you as just an object or a machine or an animal, well then maybe I wouldn't treat you kindly, but I'd have no interest in hurting you.
0:46:44 SC: Yeah, we don't go out of our way to abuse chairs and tables, right, like we go out of our way to abuse people.
0:46:47 PB: Exactly. Exactly. Exactly. Now again, dehumanization is terrible, because we also don't go out of our way to treat chairs and tables with any respect or any kindness. So in a lot of mass exterminations, people often say, "Well, they're just vermin."
0:47:04 SC: Yeah.
0:47:05 PB: But torture, degradation, humiliation, both at sort of the atrocity level but also at the individual level, seem to involve the supposition that you're dealing with people. And there's even... There's some laboratory work on this, finding that sometimes when you view somebody as morally wrong, or a morally bad individual, you in fact exaggerate their human traits, as if to make their evilness more salient and to give you more of an invitation to hurt them.
0:47:38 SC: Can you give me an example of that?
0:47:40 PB: Yeah, this is work by Tage Rai and his colleagues, very nice research where you give people scenarios where you ask them to do different things to people. It's all hypothetical pen and paper stuff. So in one condition you say, "What if I gave you a large sum of money to break this guy's thumb?"
0:48:02 SC: Right.
0:48:02 PB: So imagine that, imagine so much money you can't help but do it. And then another condition is, this guy is a serial rapist, described in great detail. Now, suppose you decide on these grounds to break his thumb. If you're willing to do it, it shifts your feelings towards him in different ways. So if your hurting him is instrumental, you'll tend to dehumanize him. You'll say, "Oh, it doesn't matter. He's not much of a person anyway." But if your hurting of him is sort of moralistic, and Rai studies moralistic violence, you tend to accentuate his human traits. Similarly, if you wanna get somebody to hurt somebody for money, you'll do better by saying, "Oh, they're just a thing. They don't feel much."
0:48:51 SC: Yeah.
0:48:51 PB: But if you wanna get somebody to hurt somebody else in response to something that that person did, you'll do better if you focus on their moral traits. So you know... And again, this has political resonance. Think how you would act if you want people to lash out at immigrants; what would you say about immigrants? And it depends: if you want them to ignore the immigrants, or just treat them as dispensable or disposable, you'd say one thing. If you wanted them to really hate them, you would say another.
0:49:25 SC: Right. And so what does this teach us about human psychology? I mean, in some sense, is it a little bit too glib to say, "We don't wanna go around hurting people"? So, if we're just instrumentally paid money, we wanna dehumanize them. But there's this countervailing punishment reflex or cruelty reflex.
0:49:46 PB: Punishment, moralization. A lot of people have pointed out that when we do... When we're cruel to one another, we don't think of ourselves as villains, we don't say, "Oh my God, I'm such a terrible person for wanting that person to suffer." Rather we're often proud of ourselves. We think of ourselves as moral agents. You... Again, you see this in the political realm, you see the incredibly... I'm on Twitter a lot, too much, but I just...
0:50:16 SC: Well, as an experimental lab for cruelty, you know it's a good place as any.
0:50:20 PB: It's all data. And you see the extraordinary nastiness directed by some groups of people against other groups of people, because they're people; the hatred directed towards... I don't know, social justice warriors on the one hand versus men's rights activists on the other hand.
0:50:38 SC: Right.
0:50:38 PB: And the people directing this hatred and cruelty don't think, "God, I'm such a terrible person for doing this." Rather, they're proud of themselves, because Harvey Weinstein or Donald Trump or Bernie Sanders or whoever the target is, they have it coming, they're bad people and they have it coming. And what this suggests in general, which I think is true, is that our moral feelings towards other people have multiple consequences. They drive compassion and love and kindness. They also drive cruelty and reprisal and punishment. And so seeing somebody as a person has all sorts of consequences.
0:51:23 SC: Yeah, is it... So do we conclude that in some sense we have natural tendencies, proclivities toward reward and punishment with respect to other people? We think some people should be rewarded, some people just should be punished as the right, valorous, just thing to do.
0:51:38 PB: We do, and I think this is universal. And in my own day job I've done studies with children and babies, and we have research suggesting that by a child's first birthday, they have some moral intuitions: they distinguish good guys from bad guys, they favor good guys over bad guys. You look at 18-month-olds; 18-month-olds will reward good guys and they'll punish bad guys by giving treats or taking away treats. A few years later you get the emergence of so-called altruistic punishment, where they'll actually give up resources to make another suffer. And there's big cultural differences in this, and it changes across age and gender and society, but I think what you're talking about, the desire to reward the good and punish the bad, is actually part of our human nature.
0:52:29 SC: And it's not just casual cruelty to people we meet... It sounds like it has important implications for the criminal justice system, for example.
0:52:38 PB: It does, it does. And so, take one example. There's a lot of people who think victim impact statements are wonderful things, where the victim talks about the trauma he or she has had with their being raped, or being assaulted, or the death of a child or something, and then the jury or the judge, based on that, determines the punishment. And I could see... I know why people favor this, giving the victim a voice is very powerful, but this seems to have all the worst features of empathy and/or punitive desire.
0:53:12 SC: Right. So you...
0:53:13 PB: Among other things, that means if the victim is an attractive, articulate person, there'll be a greater punishment towards the perpetrator than if the victim is ugly and sullen.
0:53:23 SC: Right. So yeah, this is very much along this theme of the relationship between reason and emotion, the victim's statement is blatantly appealing to emotion. It's not saying, "Well this person did the following things that we'll plug into an algorithm and figure out how to treat them in the criminal justice system" it's saying, "No, what they did is really emotionally resonant, so they should be treated worse." And what you're saying is, "That's no way to run a justice system."
0:53:53 PB: I am. That's exactly right. Just like I think empathy is no way to guide who we're kind to or what we do for charity, our punitive impulses, which often feed off of empathy, are a terrible way to run a criminal justice system. For all sorts of reasons of bias, depending on who the individual is, but also because the question of how much something offends our gut is actually not a really good way to decide how to mete out punishment.
0:54:21 SC: Right. And I think here I'm more or less completely on board. I think the anti-cruelty bandwagon is an easier sell than the anti-empathy bandwagon. But just for purposes of conversation, let me push back a little bit. We did start by saying there is this relationship between emotion and reason; isn't there something to be said for letting some of our emotional resonance into our decision-making process, even if that emotional resonance is, this person is bad, they should be punished, they should be treated badly?
0:54:55 PB: Yeah. I think it's sort of a question of where and when. I don't think much of the claim, and I don't think it's what you're arguing for, that because of our emotions about somebody who did a horrible crime, prison rape becomes this amusing matter to us, where the idea of somebody being raped in prison is a source of amusement and pleasure. I think that that's sort of allowing us to exercise the worst aspects of ourselves. On the other hand, I'm not... I think you're raising an important point, which is, to have a rich life is to have a life that includes emotions. And emotions play an important role, regulating our feelings towards others. So I'm not against being biased towards your friends over other people, or your family over other people. I'm not against feeling empathy for those you love in sort of intimate, personal ways. And I'm not even against some degree of retribution, or punitive aspects towards people, in everyday interactions. If you write a savage review of my book...
[laughter]
0:56:04 PB: Well, well, you know, we are humans.
0:56:06 SC: String 'em up.
[laughter]
0:56:07 PB: Well, okay. Well, you said that, but yeah.
[laughter]
0:56:10 SC: I've had book reviews, I know what it's like. [chuckle]
0:56:14 PB: That's where I go. Just for a little while, you could exercise the fantasy of...
0:56:17 SC: Yeah. Proportional response.
[overlapping conversation]
0:56:19 PB: Yes.
[laughter]
0:56:21 SC: Well, can we learn anything... So there's a realization that an element of being cruel comes not from dehumanization but from humanization. Is that actionable intelligence? Can we learn to be less cruel people by appreciating that fact?
0:56:40 PB: I think so. In general I tend to be quite skeptical about what psychology tells us about how to live our everyday lives, but I think this is one case where simply knowing it makes a difference. And it has actually come up in issues of international relations, where it used to be thought, for instance, that if the Israelis only knew what it was like to be Palestinian, and Palestinians only knew what it was like to be Israeli, then peace would reign and so on. And sophisticated psychologists have looked at this and often find that increased intimacy and understanding of other people's minds, and what goes on in their heads, often makes things worse and not better.
0:57:25 SC: Yeah.
0:57:27 PB: If you really know what it's like to be in the head of somebody who hates you, it's not a good thing.
[laughter]
0:57:32 PB: And maybe treating them as a little bit less of an individual and more as a problem to be solved, as a negotiation partner, is actually better. I think it teaches us in general to be a bit more cautious about the idea that getting close to another person will make us kinder to them.
0:57:53 SC: Well, I certainly do think, having had some conversations about the nature of morality and justice and things like that, that plenty of people are willing to just stop being rational about these things, because "that's just wrong, that's just right, I don't need to justify it anymore." And I'm certainly on board with treating moral behavior, or just behavior, as being subject to rational analysis just like everything else.
0:58:20 PB: That's right. And it's complicated in all sorts of interesting ways. For one thing, I think, in the end... And you were talking about this when we talked about Hume at the beginning. There's gonna be some moral intuitions, just like there's gonna be some scientific intuitions that are sort of rock bottom. If you don't agree with me that hurting people is wrong, it's not clear we have much more to talk about.
0:58:42 SC: Right.
0:58:43 PB: And... But if you say to me, "Well, we gotta keep... We're gonna have open borders." And I say, "Oh, why do you think that?" And you just say, "It's a fundamental moral view." And you stomp your feet. Well, that's not very productive.
[laughter]
0:59:00 SC: Yeah. It doesn't sound like something that should qualify as a fundamental... Even if it's the right view, whether or not...
0:59:06 PB: That's right.
0:59:06 SC: It's not that fundamental. It should be derived from something a little bit more.
0:59:09 PB: I've had arguments about charter schools.
0:59:13 SC: Yeah.
0:59:13 PB: For some reason, this is a fundamental moral position. And I say, "What? Really?"
[laughter]
0:59:17 PB: And in some way, if you think of the function of political and moral arguments, sometimes when people do that, it's not that they don't appreciate that you could actually argue about these things. What they're saying is, it's immoral to do so.
0:59:31 SC: Yeah.
0:59:31 PB: What they're saying is actually, given the kind of person I am, I wanna be seen and see myself as somebody who draws the line here.
0:59:40 SC: Well, you noted earlier that you were not trying to take a certain stance toward what is the once and for all correct moral theory, right? But can I just ask, for textual purposes, would you qualify yourself as a moral realist? Do you think there is ultimately right and wrong? Or is it more constructed by individual people and societies?
1:00:01 PB: Yeah, I'm definitely a moral realist. I think that there are facts of the matter as to what's right and what's wrong. Now, I don't think moral truths are the same as physical truths.
1:00:11 SC: Right.
1:00:11 PB: I think if there were no people, there'd be no morality. If people were constituted differently, our morality would be different. But given how people are constituted, I think there are things you would say are moral truths, or at least moral universals. So...
1:00:21 SC: Yeah.
1:00:26 PB: So, you know, there's a lot of philosophy here — some of it gets into metaethics, some of it goes over my head — but I would think that you could point to another culture of humans and say, "They're doing things badly." And if they came up to you and said, "Well, we agree, all of us, we like slavery, or we like the oppression of women," I think one could say, "Well, you're just mistaken. If you thought harder, you'd realize you shouldn't be doing things that way."
1:00:54 SC: Yeah, it's tough, because, like you already admitted, it's not the same kind of mistake as if they thought that the earth was the center of the universe. It's not a mistake you can disprove by doing some experiment.
1:01:05 PB: That's right, that's right. So it is harder; you can't just say, "Well, if only you had better telescopes, or paid more attention and thought things through, you would simply know this." On the other hand — and here I'm kind of borrowing an argument from Sam Harris, who I don't fully agree with on this, but his argument strikes me as good — because they're human and you're human, you guys share certain premises. Let's say that unnecessary suffering is bad. And if you spend enough time with them and they're rational, which they would be given enough time, you could persuade them that the way they're doing things — say, slavery — doesn't work, it doesn't respect what they think should be respected, it doesn't maximize what they think should be maximized. So I'm not coming in as a 21st century American saying, "You're doing things wrong by my lights." Rather, what I'm saying to them — say Nazi Germany or whatever — is, "You're doing things wrong by your lights. If you thought about this long enough, you'd realize that the way you're doing things violates your own intuitions of how it should be done."
1:02:22 SC: Yeah, we're not gonna solve this here, and it's a little bit off topic, but I just need to get into my disagreement with Sam about exactly this point, which is: it might very well be true — and as a matter of practice, I'm very willing to believe that it is true — that I can reason with very different people doing very different things on the basis of common moral intuitions. But I think it's a huge mistake to then leap from that to say we need to be moral realists. We don't need to be moral realists; we just have to find that common ground and reason from there. And as a matter of ontology, I'm not a moral realist. I think you can just accept the reality that morals are constructed by human beings, and go from there.
1:03:04 PB: For the purposes here, I will agree with that.
1:03:07 SC: Yep. [chuckle]
1:03:08 PB: Moral realism is another step.
1:03:10 SC: Exactly, that's right.
1:03:12 PB: And I'm tentative about making that step, but at the same time, you had sort of asked whether I'm a moral realist or a moral relativist. Moral relativism is typically associated with the position that, well, if enough people in a society are happy with some way of doing things, then that's fine for them. And you and I are both universalists in the sense that we think there's a fundamental human nature, so some groups can do it right and some groups can do it wrong — that's enough for me.
1:03:40 SC: Yeah, like I said, we're not gonna go... I'm not even a universalist, I'm a constructivist, but that's okay. It doesn't make any difference to how you act in the world, is my only point right now. I do...
1:03:51 PB: As long as we can disapprove of other people.
1:03:53 SC: That's right, those people, boy. Until I have them on the podcast, and then they'll get their say. But speaking of Sam — just to wrap things up — you wrote a provocative little piece with him about robots and artificial intelligence. Say we start torturing the robots: is that a bad thing? Should we extend some of the considerations we've had here to non-biological intelligences? Where do you come down there?
1:04:19 PB: Yeah. It was a hugely fun article with Sam. It grew out of a podcast discussion I had with him, and it was focused around the example of Westworld, which was showing at the time we were talking; we were both watching it. And I'm sure you know, but some people don't, that the story of Westworld — originally a movie, then an HBO series — is that in the near future they create this Wild West fantasy world which you can visit, and it contains incredibly realistic robots who are indistinguishable from people, and some of the guests take advantage of this to kill, to rape, to torture, and so on. And so...
1:04:56 SC: Yeah. Because they're just robots, right?
1:04:58 PB: Pardon me? Yes that's right.
1:05:00 SC: Because they're just robots, who cares?
1:05:00 PB: That's right. That's right. And so... It was a fun article to write, surprisingly uncontroversial — you never know. But what we said is, "You shouldn't do that; you shouldn't rape and torture and kill robots like that." And our argument was twofold. One part of it was that you don't know — they might be conscious, and if they were conscious and could feel pain, then you definitely shouldn't do that, for the same reason you shouldn't do it to a person. And the second part of it is, even if you could know for sure that they weren't conscious, it seems likely to us, at least, that doing this to things that are indistinguishable from people would affect your relationships with people. So it's like Kant's position about...
1:05:51 SC: Yeah.
1:05:51 PB: Being cruel to animals — he thought there's nothing intrinsically wrong with being cruel to animals, but that if you were, it would make you cruel to people.
1:06:00 SC: Right. I mean these are two very different arguments that you're presenting, so.
1:06:04 PB: The first one I take it is not very controversial.
1:06:09 SC: Yeah, but... Although you made me think that I really missed a chance when I interviewed David Chalmers on the podcast, 'cause we talked about two topics which I considered to be more or less separate. One was the nature of consciousness, and the other is, "Can we live in a simulation?" And he thinks that consciousness is not reducible to the physical, that there are mental properties over and above physical properties. But he also believes that the things that happen in a simulation are just as real as things that happen in the so-called real world.
1:06:37 PB: Yeah.
1:06:37 SC: But I didn't quite prod him on whether or not there was tension there. If I simulate an intelligence, but without the consciousness, the awareness properties that we human beings have, is it still torture to do bad things to it? And I think you're gonna be pretty down to earth and say, "Sure, it's just as bad."
1:07:00 PB: Yeah, I would say there's often a fact of the matter. I mean... I don't know. If I put a happy face on my laptop and scream at it, and try to verbally humiliate it, I'm just doing something foolish, I'm not doing something wrong. And the same if I yell at my Alexa or my Siri, as many people do. So there's a fact of the matter as to whether it's conscious or not. And then the second part of the argument is that even if it isn't, if it's perfectly... I'm sorry, my Alexa just spoke up.
[laughter]
1:07:41 PB: Here we go. Even if it's not conscious, it would still be bad. And this is kind of a funny argument for me to make, 'cause I think the claims that violent video games make people worse and more violent towards humans are actually mistaken. There's so little evidence for it and so much evidence against it.
1:08:01 SC: Right.
1:08:02 PB: On the other hand, if you could go and beat a child to death with your hands knowing it's not a child, but a machine...
1:08:12 SC: Yeah.
1:08:12 PB: That's indistinguishable from a child, what would that do to you? And I think the answer is nothing good.
1:08:17 SC: Well, Westworld is a wonderful example to use, but I think it's very realistic that in the very short term this kind of problem is gonna show up, right? We have sex dolls that look very realistic; we have things like Siri and Alexa that can talk to us and express emotions even if they're not conscious by anyone's definition. The idea that it seems to us just like we're torturing or harming something that is like a person is gonna become frighteningly plausible.
1:08:52 PB: It is gonna happen sooner or later. The trend is gonna go towards increasingly complex and realistic machines, including sex robots. And what seems like kind of a fun, goofy science fiction example now could be quite serious a decade from now. I mean, would you put an estimate on how far we are from a machine that's indistinguishable from a person?
1:09:20 SC: I think that's a great question. And it depends a lot on how careful you are about distinguishing. [laughter] I think that we're not that far from machines that you could be tricked into thinking are human if you didn't try that hard, right? I think we're there. There have been these artificial intelligence programs that can do call-center kinds of things and ordinary humans can't tell the difference. But if you really put your mind to it and tried to give them the Turing test, I think we're still very far from their being able to pass that.
1:09:50 PB: I think so too. I think it's much harder than people say, and it may be that, as with self-driving cars, it's the last 5% that's gonna kill us.
1:10:00 SC: Yeah, that's right. Good, so let's be nice to the robots, let's be nice to each other, and let's empathize with some people in the right circumstances, and mostly be driven by our rationality, how about that?
1:10:13 PB: I think that's a fine way to end.
1:10:15 SC: Alright, Paul Bloom, thanks so much for being on the podcast, this was very helpful and a lot of fun.
1:10:19 PB: Thanks for having me, this was great.
[music]
Don’t we have an empirical test for this? Autistic people have very little to no empathy. Are they terrible people?
I haven’t read the book, but based on the interview, I agree with Professor Bloom. From my point of view, empathy is being bought at the cost of truth and of competence.
In 1992, a candidate for the Democratic presidential nomination was running around New Hampshire and Iowa shouting “I feel your pain.” My reaction was “Bull shit”, but Democrats ate it up and Bill Clinton won the convention delegates and the nomination. That was my first experience with empathy and I hadn’t learned the word yet. (I was 42.)
Then there was the movie “Adam”, where the title character has Asperger’s Syndrome, the total inability to see things from another’s point of view. I thought that Adam was perfectly healthy and the rest of his world needed to be cured of the propensity to lie.
I have had rather serious issues with the incompetence of the clerks in physicians’ offices. Hiring practices, it appears, put a high value on empathy and a low one on competence and I have learned to find the competent person in an office, if one exists, and to direct all of my comments to that one person.
Bloom’s book is now on my reading queue.
People on the autism spectrum have been thought to have a deficit in cognitive empathy (perspective-taking), not affective (emotional) empathy. Paul Bloom’s book is mainly about affective empathy. I’m currently writing a lengthy blog post on empathy, as it was a large part of my doctoral research, which considers Bloom’s book at length (it will be longer as a result of listening to this podcast episode!). But one benefit of critically analyzing empathy as a moral emotion is that it may lead people to stop equating empathy, broadly, with morality, and to stop assuming that because people with autism are, broadly, deficient in empathy, they are also deficient in morality.
Thanks for that, entertaining podcast! Still not much in the way of actual disagreement, maybe one of these days..
Minor correction on the bat and ball puzzle. It is typically presented as:
“A bat and ball together cost $1.10. The bat costs *$1.00* more than the ball. How much does the ball cost?”
Great Podcast, thank you both.
On the topic of cruelty
Paul: I’m on Twitter a lot
Sean: It’s an experimental lab for cruelty
LOL
A recent small attitudinal study conducted by Amaze (the peak body for autistic people and their families in Victoria, Australia) found that while 29% of people in the general sample surveyed thought they had a good understanding of how to support autistic people, less than 2% of the autistic respondents agreed (Jones et al, “Community Attitudes & Behaviours Towards Autism; and Experiences of Autistic People and their Families” 12 December 2017). Sobering stats indeed; some of the comments responding to this very interesting discussion on empathy certainly reveal how widely autistic people, and autism more generally, are misunderstood in the mainstream community.
Sean, it’s interesting to observe how often autism comes up in the discussions with your guests (I found much food for thought in Lisa Aziz-Zadeh’s observations about embodied cognition in this respect). It would be fascinating to hear a thoughtful discussion about neurodiversity (ie one that didn’t get caught up in the culture wars) and what embracing it might mean for intellectual progress, especially in areas like theoretical physics, which appear to the outsider to be dominated by orthodoxies and cliques of thought.
Empathy may be problematic for many, but it is absolutely essential for those in power, especially elected leaders. Presidents and members of Congress who lack empathy have little chance of improving the lives of families living below the poverty line. Too many politicians are admirers of the economic policy of Ayn Rand, who advocated the cruel philosophy of individual selfishness and denied that government had a legitimate role in improving the economic welfare of United States citizens. Without the empathetic Eleanor Roosevelt, who used her influence with her husband and Democrats in Congress, it is doubtful that the safety net of Social Security could have been passed in 1935. Although Rand argued against Congress establishing this vital program, she applied for, and was granted, much-needed Social Security benefits in her old age.
It seems to me that successful politicians today say, in so many words, "I feel your fear. And oh, by the way, I can protect you."