Social Science Bites

Alex Edmans on Confirmation Bias 

April 2, 2024

How hard do we fight against information that runs counter to what we already think? While quantifying that may be difficult, Alex Edmans notes that the part of the brain that activates when something contradictory is encountered is the amygdala – “that is the fight-or-flight part of the brain, which lights up when you are attacked by a tiger. This is why confirmation bias can be so strong; it’s so hardwired within us that we see evidence we don’t like as akin to being attacked by a tiger.”

In this Social Science Bites podcast, Edmans, a professor of finance at London Business School and author of the just-released May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases – And What We Can Do About It, reviews the persistence of confirmation bias — even among professors of finance. 

“So, what is confirmation bias?” he asks host David Edmonds. “This is the temptation to accept something uncritically because we’d like it to be true. On the flip side, to reject a study, even if it’s really careful, because we don’t like the conclusions.” 

Edmans made his professional name studying social responsibility in corporations; his 2020 book Grow the Pie: How Great Companies Deliver Both Purpose and Profit was a Financial Times Book of the Year. Yet he himself has encountered the temptation both to quickly embrace findings, even flimsy ones, that support one’s thesis and to reject, or even tear apart, research that doesn’t, however robust its results.

While that might seem like an obvious critical-thinking pitfall, surely knowing that it’s likely makes it easier to avoid? You might think so, but not necessarily. “So smart people can find things to nitpick, even if the study is completely watertight,” Edmans details. “But then the same critical-thinking faculties are suddenly switched off when they see something they like. So intelligence is, unfortunately, something deployed only selectively.”

Meanwhile, he views the glut of information, and the accompanying glut of polarization, as making confirmation bias more prevalent, not less.

Edmans, a fellow of the Academy of Social Sciences and a former Fulbright Scholar, was previously a tenured professor at the Wharton School and an investment banker at Morgan Stanley. He has spoken to policymakers at the World Economic Forum and in the UK Parliament, and he gave the TED talk “What to Trust in a Post-Truth World.” He was named Professor of the Year by Poets & Quants in 2021.

To download an MP3 of this podcast, right-click this link and save. The transcript of the conversation appears below.


David Edmonds: What do you think about Social Science Bites? Obviously, that it’s the world’s most interesting podcast on social science. Now, suppose you were to read an article that purported to back that up with evidence. Well, given your prior belief, you might be more inclined to believe it, and more inclined to dismiss an article that came to the opposite view. This is confirmation bias, and Alex Edmans of the London Business School has become so concerned about this and other biases that he’s written a book about it, May Contain Lies. Alex Edmans, welcome to Social Science Bites. 

Alex Edmans: Thanks, David. It’s great to be here. 

Edmonds: The topic we’re talking about today is confirmation bias. This is part of your research on misinformation. But this entire research project seems to be a departure for you. You’re a professor of finance, and I think it’s fair to say that most of your research so far has been on corporate governance.

Edmans: That’s correct. I never set out to be a professor of misinformation. Most of my work is on social responsibility, which comes under different names like ESG or sustainable finance, and I believe I was one of the early pioneers of the idea that companies that are good for society also deliver higher shareholder returns. One of my early papers found that companies that treat their workers well and create a great corporate culture do better than their peers, and that’s how I got into the idea of sustainable finance.

Edmonds: So treating their workers well means what? Just having a nice atmosphere at work, paying them better than the market rate? What counts as treating your employees well? 

Edmans: So, what I measured was inclusion in the list of the 100 Best Companies to Work For in America. That is a list compiled after surveying 250 to 5,000 employees at all levels and asking them 57 questions on issues such as credibility, respect, fairness, pride and camaraderie. Certainly quantitative factors, like pay and benefits, will affect that, but so will qualitative factors: does the boss treat you as an individual rather than a factor of production? Do they give you opportunities to step up and develop within the organization? All of those intangible factors also matter, and what I found was that companies that did treat their workers well did outperform. So it’s not that they’re being fluffy or woke; they are being commercially sensible, investing in their most important asset, their people.

Edmonds: That’s a lovely result. That’s the result we want to hear. We want to believe that companies that treat their employees well, that treat the planet well, do better. Is it also true that diversity of employment helps, that having different ethnic groups represented on the board is good for the company? Because I know some people have done research claiming that that is the case.

Edmans: Yeah, so that’s how I got into the topic of misinformation. Being seen as one of the early pioneers of sustainable finance with that paper, I then learned of other papers which also seemed to give similar results: that if you do good stuff, then you do better as a company. And one of the good things to be is a diverse company. That’s something that I would love to believe, as somebody who believes in the importance of sustainability. Personally, maybe you don’t get this on the podcast, but I’m an ethnic minority, so I have personal reasons to want it to be true. And there are tons of studies out there by luminary organizations such as McKinsey or BlackRock, even a regulator like the Financial Reporting Council, claiming there’s a clear link between board diversity and financial performance. But when you look at those papers, there’s a huge amount of flimsiness; the evidence does not at all support the conclusions. Just to give you an example of one study: it claimed to find a strong result, yet it ran 90 different tests and none of them gave a positive result. They just lied in the executive summary; they claimed a result that wasn’t there. And people accepted it without checking the tables at the back of the report, because they wanted it to be true.

Edmonds: Lied is a very strong term. So it wasn’t just that they wanted to believe it, like all those who read the report; it was actually that they were deceiving their readers.

Edmans: That’s correct; they misrepresented the results. You might think there are different forms of misinformation. One form is that you conducted the test in a sloppy way, found a positive result and claimed a positive result, and the result itself was sloppy. Here, there was no result to begin with: the tests did not find a positive result, yet they claimed to have found one, and that is a different level of misinformation. This highlights what a serious issue misinformation is. Yes, we can quibble about whether the methodology is correct, whether you measured diversity the correct way. But misrepresenting your own results is outright deceiving your readers.

Edmonds: But this piece of research was so compelling because so many people wanted to think it was true. 

Edmans: That’s correct. And this is the big issue of confirmation bias. So, what is confirmation bias? This is the temptation to accept something uncritically because we’d like it to be true. On the flip side, to reject a study, even if it’s really careful, because we don’t like the conclusions. With this study, this idea that diversity improves performance, there were lots of well-meaning people who thought, ‘Yeah, that is just intuitively true. A diverse team makes better decisions, and if this report by this famous organization finds this conclusion, it must be true. I don’t even need to look at the tables. I’ll believe them. I’ll shout it from the rooftops. I’ll write headlines myself.’ And some respected organizations did write headlines about the study. Those headlines were wrong. They spread misinformation. Why? Because they wanted the result to be true.

Edmonds: You cite various pieces of evidence to show that confirmation bias exists. One piece of evidence is neuroscientific. Tell us about that. 

Edmans: Yeah, so this is a great set of studies done by some neuroscientists. They took some students and hooked them up to an MRI scanner to scan their brain activity. First they gave them some statements which were nonpolitical, such as ‘Thomas Edison invented the lightbulb,’ and then presented some evidence to contradict that. Nothing much happened; there wasn’t a great deal of brain activity, nothing to write home about. But when they gave a political statement which they knew the student agreed with, such as ‘the death penalty should be abolished,’ and then presented evidence to contradict that, the brain did light up. And the part of the brain that lit up was the amygdala; that is the fight-or-flight part of the brain, which lights up when you are attacked by a tiger. This is why confirmation bias can be so strong; it’s so hardwired within us that we see evidence we don’t like as akin to being attacked by a tiger. And that’s why people get defensive on social media; they might retaliate and take something personally if somebody presents a different viewpoint.

Edmonds: Does it help tackle confirmation bias to be presented with different sources of information? If I think Brexit was a disaster — and I come across newspaper articles suggesting that actually the economy has done much better than expected — does that help counter my bias? 

Edmans: You might think so, because you think the issue is misinformation: why not just gather more information? But the idea of confirmation bias suggests it’s more difficult than that, because what matters isn’t just having information, but whether you take that information seriously or respond to it uncritically.

And there’s another interesting study, where some other researchers took a different set of students who had views on the death penalty. They gave them two studies, one of which supported the death penalty and one of which opposed it; by supported or opposed, I mean they looked at what happened to crime after the death penalty was introduced. The students were asked to rate the accuracy of those studies. And if you were a death penalty supporter, you would just accept the pro-death-penalty study and say, ‘This is a great methodology,’ while trying to pick every single hole in the one that opposed it. Importantly, those studies were of exactly the same quality. How did the researchers know? Because they randomized: they took the same pair of studies and presented it both to death penalty supporters and to death penalty opponents. So the reactions depended entirely on your prior viewpoint, not on the studies’ quality, because those were the same studies in both cases.

And what this led to was something called belief polarization. You might think that if we put all the facts on the table and we all see the same information, then you, a Brexiter, and me, a Remainer, will agree. But it’s actually the opposite, because I will find the information that is pro-Remain and latch on to that, and you’re going to find the information that’s pro-Brexit and lap that up. So actually, more information can leave us less informed if we’re not aware of our biases, because more information just allows us to find something that supports our viewpoint.

Edmonds: I love the study that you cite about gun control. Can you explain that? It’s very relevant in the age of the internet, when we can quickly find pieces of information or facts to back up our argument or reject an alternative view.

Edmans: Thanks for bringing that up, David, because the type of confirmation bias we’ve discussed so far is what I call biased interpretation: once you have information, you interpret it in a biased way; I latch on to something I agree with and reject something I disagree with.

But there’s a second type of confirmation bias, which is biased search: what types of information do we go out and actively search for to begin with? So this study again took some undergraduates, ones with strong beliefs about gun control, and gave them different types of information that they could obtain in order to research the issue. Now, importantly, they were told to research it in an even-handed way so that they could explain the gun control issue to other people. But what the researchers found was that the information the students chose to search for was strongly correlated with their prior views on the issue. So there was some information from Citizens Against Handguns, and also some other information from the National Rifle Association. Now, if you’re a gun-control supporter, guess which one you’re going to look at. You’ll look at Citizens Against Handguns, and you won’t even bother to hear what the National Rifle Association has to say. And David, you’re absolutely right, this is relevant right now, because we’ve got so many sources of information out there, but we also know which information will trigger our amygdala and which information will make us happy and calm. So if I’m a right-winger, I might look at Fox News, and I’m not going to read The Guardian. And if I’m highly liberal, then I will do the opposite.

Edmonds: So a range of sources of information doesn’t help. Let me try something else. What about expertise? Suppose that you are an expert in Brexit; you have a PhD in Brexit. Will that help reduce your confirmation bias if you see arguments that oppose the position you’ve taken?

Edmans: Unfortunately, that might not help either. There were further studies which looked at the effect of intelligence on whether you’re likely to respond to information correctly. You might think, well, if you’re intelligent, don’t you learn at university that you need to consider both sides? But actually, there’s a twist: the more intelligent you are, the greater your ability to poke holes in an argument that you don’t like. So smart people can find things to nitpick, even if the study is completely watertight. But then the same critical-thinking faculties are suddenly switched off when they see something they like. So intelligence is, unfortunately, something deployed only selectively.

And I see this myself, because I’m quite active on LinkedIn, where I share lots of studies on sustainable finance. That’s my field. Importantly, I try to share studies which support the idea of sustainable finance and also those that oppose it, even though they contradict my own research, because what matters to me is whether the research is good. Now, when I’ve posted a study with an inconvenient truth, like diversity doesn’t lead to better performance, there’s no shortage of people trying to poke holes in it. Is this correlation or causation? Is the time period too short? Was the measure of performance exactly the one they should have used? But do I see the same critical-thinking faculties applied when I post a study that people like the sound of? No, they just lap it up, uncritically.

Edmonds: It sounds like you’re suggesting that your expertise, or expertise generally, far from helping, might actually make confirmation bias worse.

Edmans: It can do, if you’re not wary of your own biases. Because if you think you’re an expert, you might think that anybody who contradicts you just doesn’t have the same level of expertise as you, and therefore you might dismiss them. We see this in some large organizations: as a CEO, you’ve risen to the top, and when people suggest there are concerns with the strategies you’re pursuing, you just say, ‘Well, you don’t know as much as me.’ You might know the book The Big Short, where they talk about a Morgan Stanley head of proprietary credit whose trades allegedly led to $9 billion of losses; when somebody disagreed with him, he would allegedly say, ‘Get the hell out of my office.’ Why? Because he thought he knew everything. He was a great bond trader, and if people expressed concerns about his risk position, he would not believe that his own instincts were wrong.

Edmonds: Now, you might not be able to answer this. I don’t know if there’s any data on it. But do you think our susceptibility to confirmation bias has got worse over time? 

Edmans: I do think so, for a couple of reasons. Number one, we have so many sources of information right now that it’s easy to handpick whatever source confirms our viewpoint. In the past, when I was young, you knew what the sources of information were. I had to trek out to the library and look things up in the encyclopedia, and that was seen as the authority; it was people who knew the topic who wrote the encyclopedia. Right now, anybody can post something on LinkedIn, and there are some real best-selling books written by people who might not have expertise in the topic, but because those books give messages that people want to hear, they are lapped up. So it’s really easy to find something that you like.

Another issue that I see is just more polarization. Now, what people believe might not just be what they think is true based on the evidence, but what they would like to be seen as standing for. So if you’re somebody who wants to be seen as a pro-diversity person, that does get you quite a lot of friends and a lot of followers. And if you want to be this pro-diversity person, you want to believe only the evidence which supports your viewpoint; any sort of equivocation might seem weak. We often hear about politicians who do U-turns, and that is said to be a bad thing. To me, a U-turn is responding to evidence, changing your view when the evidence changes. But this idea of being nuanced is seen as flip-flopping, it’s seen as weak, so just having a very strong position is something that can get you to the top.

Edmonds: I’ve made various suggestions about what might combat confirmation bias, and you’ve ruthlessly knocked each and every one of them back. So what does work? 

Edmans: I think the first thing that works is just to be aware of confirmation bias to begin with. When people think about issues such as, say, addiction, they say: first, recognize that you have a problem. That’s the first step in trying to address it. And I know that I suffer from confirmation bias myself. That’s one of the reasons why I wanted to write the book and why I looked into the topic: I recognize that these are things that I need to try to address myself.

And then, after recognizing that this is an issue, with every bit of information we want to ask: are we responding to this because we like it or dislike it, or based on the actual quality of the results? So one question I encourage people to ask, if they find a study and they really want to believe it, is: imagine it had found the opposite result, and you wanted to take it apart. What questions would you ask? If I found a study showing that diversity worsens performance, I might say, ‘Well, actually, that study was only based on two years of data; we need to look at 10 years.’ Or, ‘Maybe they measured performance using profitability, but that’s not a good measure because it could be very short term; I want to look at long-term returns,’ or something like that. Then, once we see the holes we would poke if we didn’t like the result, we ask whether those concerns still apply even though the result is something we would like. That would make sure that we are responding not just to the result, but to the methodology itself.

Edmonds: You mentioned you might suffer from confirmation bias when it comes to governance. In what other areas are you susceptible to confirmation bias?

Edmans: I think any type of research which might support the research that I’ve done. Given that my own work links employee satisfaction to performance, I might be too willing to believe other research suggesting that corporate governance improves performance. Similarly, in the Brexit referendum several years ago, I was a strong Remainer, and so I would have liked to believe that any argument for Brexit was based on misinformation, was believing what was written on the back of a bus. So how did I try to combat that? I tried to actively seek out the viewpoints of Brexiters whom I thought I would respect. There was a talk given by Roger Bootle, a supporter of Brexit and also a very respected economist; he runs Capital Economics, an economics consultancy. Luckily for me, the talk was hosted by Merton College, Oxford, where I’m an alum. This was an alumni event, and I went along, yeah, partly because it was the Merton alumni event and there were going to be some free canapés and champagne. But I also thought, ‘Well, let me listen to this open-mindedly,’ and I was shocked to find his arguments really well put together. I didn’t agree with every single thing that he said, but at least I saw there was a logic behind it. It wasn’t just xenophobia or misinformation, which people like me had been led to believe about the other side. When I got home, I researched the points myself, and I thought, ‘Yeah, this actually seems to stack up.’ To me, this was the most eye-opening talk I’ve ever seen in my life. I then wrote up the case for Brexit on my blog, so I had both the case for Brexit and the case for Remain. I wanted the readers of my blog to see that there are actually two sides to an argument that people like me had previously wanted to portray as one-sided.

Edmonds: If you were to see a study on confirmation bias showing that it was a great problem and getting worse, would you be more likely to believe it?

Edmans: I probably would, because that’s what I believe and that’s what I’ve suggested in my answers. So then what would I have to do? I’d have to ask the same questions myself. I’d have to look at whether they considered correlation versus causation, and whether they have a good control group. And this is why, in the studies we’ve mentioned so far in this chat, I do believe the data are good. In the death penalty studies, they took exactly the same pair of studies and presented it to different people, so any differences in interpretation were due to people’s biases rather than to the studies being of inherently different quality.

Edmonds: Alex Edmans, thank you very much indeed. 

Edmans: Thank you so much, David. It was great to be here. 

Welcome to the blog for the Social Science Bites podcast: a series of interviews with leading social scientists. Each episode explores an aspect of our social world. You can access all audio and the transcripts from each interview here. Don’t forget to follow us on Twitter @socialscibites.

