Ep 297: Raising Skeptical Thinkers

Andy Earle: You're listening to Talking to Teens, where we speak with leading experts from a variety of disciplines about the art and science of parenting teenagers. I'm your host, Andy Earle.

So many of the things we are told or read about claim to be based in research and in science, but aren't actually true. Or if they are, they're a lot more nuanced than we might be led to believe.

We might think that we're discerning people and that we do a good job of analyzing every claim we hear. But the truth is that even the smartest and most informed people among us often believe things that aren't really true.

How can we get better at telling the difference? And how can we raise teenagers who know how to spot the truth?

We're here today with Alex Edmans, a professor of finance at London Business School, whose TED Talk, What to Trust in a Post-Truth World, has been viewed 2 million times. He's also the author of the book May Contain Lies.

Alex, thank you so much for coming on the podcast. Really excited to have you here.

Alex Edmans: Very excited to be here.

Andy Earle: So you have written a book called May Contain Lies, all about how we often believe things that the evidence or data might not actually support in the way we think. Talk to me a little about that. What led you to be interested in this topic and in writing this book?

Alex Edmans: It's because I see this misinformation in finance and economics, but as we'll shortly discuss, it's much broader than that. So why do we see this misinformation? Because of what's known as confirmation bias. That's the idea that we have a view of the world.

And if we see some evidence supporting that viewpoint, we will accept it uncritically, without even checking whether it's true. And if we see evidence that contradicts our worldview, we might not even read it to begin with. Or if we do read it, we might read it with the mindset of picking it apart. And that was what I typically encountered: people would respond to the research based on its appeal, not its accuracy. But when I branch out beyond economics and finance, I see this in many fields.

Before my son was born, I went on a parenting course, and they told us we should exclusively breastfeed our kid for the first six months. This was based on rigorous evidence from the World Health Organization that breastfed babies do better than bottle-fed babies on many dimensions.

Andy Earle: And it's better for mom, too.

Alex Edmans: Better for mom, better for recovery, better for maternal-child bonding. All of these things, again, based on evidence. And the evidence that breastfed kids do better played into my biases, because we like to think something natural is better than something artificial. But when you dig a little deeper into the data, there are other things going on.

So who are the mothers who are able to breastfeed? Breastfeeding is really tough, so it's hard to do without a supportive spouse, and it's hard to do without family support; maybe they've got help outside the house. And maybe it's those things that drive the health outcomes, the IQ outcomes, the better maternal recovery, not the breast milk itself.

When you strip out the effect of those other factors, the effect of breastfeeding is pretty much zero. And that's striking, because mothers are often guilt-tripped into thinking that they're not a good mother if they reach for the bottle, because of all of this supposedly overwhelming evidence, when actually the evidence is not so strong.
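To make that concrete, here is a minimal simulation sketch of the confounding argument. It is not from the book; the "family support" factor and all the numbers are invented for illustration. Support drives both who breastfeeds and the child outcome, so the raw gap looks meaningful even though breastfeeding itself does nothing in this toy world.

```python
# Toy confounding demo (illustrative numbers only, not real data):
# "family support" raises both the chance of breastfeeding and the outcome,
# while breastfeeding itself has zero effect by construction.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

support = rng.random(n) < 0.5                       # hypothetical confounder
breastfed = rng.random(n) < np.where(support, 0.8, 0.2)
outcome = 100 + 5 * support + rng.normal(0, 10, n)  # support, not milk, matters

# Naive comparison: looks like a solid ~3-point advantage for breastfeeding.
naive_gap = outcome[breastfed].mean() - outcome[~breastfed].mean()
print(f"raw breastfed vs bottle-fed gap: {naive_gap:.2f}")

# Compare like with like: within each support group the gap is ~0.
for s in (True, False):
    grp = support == s
    gap = outcome[grp & breastfed].mean() - outcome[grp & ~breastfed].mean()
    print(f"gap within support={s}: {gap:.2f}")
```

Stratifying by the confounder is the simplest version of "stripping out the effect of those other factors"; the studies Edmans describes do the same job with regression-style controls.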

And so this is why I thought, let me write a broader book, beyond just economics and finance, about how we need to be very careful with the information we see.

Andy Earle: I thought that was so interesting, how you break down breastfeeding in the book, the journey you went through looking at the research, and also confirmation bias.

And you talk about an aspect of confirmation bias called belief polarization. What does that look like and how does that play out?

Alex Edmans: Yeah, so you might think that the solution to any bias is to get more information. Isn't information the opposite of misinformation? So if we see the same evidence, if we put all the facts on the table, then everybody will agree. But it's not as simple as that. Why?

Because how we interpret the information depends on whether it supports our worldview. So there was a study on some undergraduate students who had strong views on the death penalty. Some of them were very pro-death penalty, some of them were very anti, and they were given two studies.

One of them supported the death penalty, one of them opposed it. After reading those studies, the students were asked to criticize them, and they only criticized the ones that disagreed with their viewpoint, even though the studies were of the same quality. And afterwards, they were asked: have your views changed?

And they had become even stronger believers, either pro-death penalty or anti-death penalty, even though they'd seen balanced evidence, one study for and one against. Why? They'd latched onto the one that supported their viewpoint and dismissed the other, and so their beliefs became even more extreme. We think the solution to misinformation and polarization is to put the facts on the table.

But if we come to those facts with a preconceived viewpoint, then we will not approach even balanced information with a balanced head.

Andy Earle: I think that's so true. One of my psychology professors used to say that it's funny: when you give students a study that contradicts their views or how they think of the world, they suddenly become experts in research methods. They're great at picking it apart and pointing out everything that's wrong with it or doesn't make sense. But I think we do the opposite when something aligns with what we want to believe or with our preexisting views; then we don't really pick it apart or start to look at it critically.

Alex Edmans: And that's a really important observation, which has some practical implications for parents of teenagers. If your teenager sees a study that he or she really wants to be true, one tip is to ask them to imagine the study found the opposite result. If it found the opposite result, how would they try to knock it down?

Let's say I want an excuse to drink lots of red wine this evening. I find some studies showing that people who drink red wine live longer. Now, I want to believe that. But if I imagine the opposite result, that people who drink red wine live shorter, I would knock it down by arguing that maybe people who drink red wine are poor.

They can't afford champagne or the top-shelf stuff, so they're having red wine, and maybe it's their poverty that leads to the shorter life. So now that I've alerted myself to the possibility of these alternative explanations, I can ask whether similar alternative explanations hold for the actual result.

Maybe the reason people who drink red wine live longer is that they're able to afford red wine rather than something cheaper like beer. And so I think this is so useful. You might think, okay, a busy teenager doesn't have the time to go into the weeds of a study. They might not even be studying statistics at school. But this is just common sense, right?

We apply our common sense when we see a study we don't like. The trick is to activate it when we find a study or a story that we do like.

Andy Earle: I love that. It's such an actionable and useful way to think about it. So often we find our kids just rattling off something they read on social media, or something they heard someone say about a study mentioned in the paper.

And we're so far removed from the actual research being done. I guess one solution is, oh, we'll go to the research and actually look at it critically, but I love what you're talking about because it can be so much simpler, just a mindset shift. You talk in the book not just about confirmation bias, but about twin biases that are behind a lot of this accepting of things we hear without really looking into them critically.

So what is the other bias?

Alex Edmans: Yeah, thank you. So confirmation bias is the idea that we have an existing viewpoint, and we interpret evidence according to whether it agrees with that viewpoint. But you might think there could be other topics on which we have no existing viewpoint.

And that's where the second bias comes in, which is black and white thinking.

So we might not have a preconceived viewpoint, but black and white thinking means we think that something is either always good or always bad; there are no shades of grey. Again, let me give a practical example. The Atkins diet was a diet focused on carbs. Why are carbs interesting? Because protein, most people think, is good.

We learn that it builds muscle. Fat, most people think, is bad; it's called fat because it makes you fat, so it must be bad. But carbs are not so clear cut, so most people might be pretty neutral on carbs. But if they have black and white thinking, they must think it will be either one way or the other. And the Atkins diet took one extreme position, which was to say, avoid all carbs, or minimize carbs to the extent possible.

And that went viral. Why? It played into black and white thinking. It was very easy to implement, because people following the diet only needed to look up the carbs line on the nutrition label. They didn't need to think, is this complex carbs or simple carbs?

No, if it's high carbs, I'm just going to avoid it. But interestingly, if Atkins had proposed the opposite diet, of eating as many carbs as possible, that might also have gone viral. Why? It also plays into black and white thinking. And to be a bestseller, Atkins didn't need to be right, he just needed to be extreme.

And a lot of the things teenagers fall for right now, the things widely shared on social media, are like this: do X as much as possible, or do Y as little as possible. These are things with no shades of grey, no nuances, because simple is easy to remember and gives clear, actionable messages.

But in life, most things are shades of grey. Most things, even if they're good, are only good in moderation. We need to be careful that we are not drawing extreme conclusions.

Andy Earle: Yeah, I love that, because those are the kinds of things that get the likes and are more likely to go viral.

We get it. It's taking a really strong stance on something. And it also strikes me that the Atkins diet appeals to our confirmation bias, in that we want to believe, oh, I can lose weight, I can lose 30 pounds in a month, and I can still eat bacon and not really have to restrict myself, just follow one simple rule.

It's like a perfect storm of appealing to both of those twin biases.

Alex Edmans: Yeah, so if something is easy, and if it's something we want to be true, then it will go viral. If it's easy to share with our friends, then people are much more likely to share it. So why do we share stuff? Again, people don't share for bad reasons. They want to be helpful, and if the message is, this is awesome, they really think they're helping their friends by sharing it. And if it's easy, they think, this is a practical, actionable tip that my friends can use. So I know a lot of people complain about teenagers and say, oh, they're spreading misinformation.

They're falling for things that are simple, but life is complicated, and simple messages do cut through. My point here is that things which are simple are often actually simplified. Sadly, the real world is more complicated, and even though the more nuanced message might be more difficult to put into practice, it is more grounded in evidence, such as the advice to eat, say, five portions of fruit and vegetables.

It's not zero, avoid all fruit and veg; that would be easy. It's not eat as much as possible; that would also be easy. Here you might have to track and count, but that's what's more consistent with the scientific evidence.

Andy Earle: Yeah, and you point out in the book different ways of thinking about the categories that real situations might fall into, other than being black and white, which I thought was really interesting; I'd encourage people to check that out.

You also have this ladder that you keep coming back to in the book, with its different rungs. Can you walk me through it? Where does it come from?

Alex Edmans: Absolutely. So we've talked about the problems. We've talked about the biases, which cause us to fall for misinformation.

Let's look at the solution. You might think the solution is just to check everything, and that is the solution, but it seems unrealistic, right? There might be 175 different ways in which people can misinform us, and realistically, we're not going to remember all 175 ways and put them into practice.

So what I wanted to do was drill down and boil all the types of misinformation down into just four categories, which any time-pressed person can hopefully keep in mind. I illustrate this in a graphic which I call the ladder of misinference. Why a ladder? Because when we start from some facts and draw conclusions from them, we are climbing a ladder.

But I call it a ladder of misinference because, as we climb, we make missteps: we infer things that we actually should not. So let me go through the four steps in turn. The first step is: a statement is not fact, because it may not be accurate. We like to quote stuff, but sometimes we quote things out of context.

The stakes of this can be really high. This is not just about academic accuracy; these really might be life or death situations. One sad example is the opioid epidemic in the US. I believe 650,000 people have died over 20 years because of opioid overdoses. Why were opioids prescribed so freely?

Because of an article in the New England Journal of Medicine, highly respected, whose title was "Addiction Rare in Patients Treated with Narcotics." This gave the impression that you can treat people with narcotics and they won't be addicted. But this was not actually a study. Yes, it was in the New England Journal of Medicine, but it was a letter to the editor.

So it was a reader's opinion piece. But it has been cited 1,650 times, and people quote it without actually looking at what it was, which was a letter. It was also a letter that looked at people who were given opioids in a hospital, where use is clearly controlled and prescribed; if you give opioids to outpatients, they might end up overdosing.

So often people quote this, and the quote was accurate; it literally did say "Addiction Rare in Patients Treated with Narcotics." But they were quoting it without knowing the context: it was a letter, and it was based on hospitalized patients. That's the first step. Check not only whether the statement is true, but also its context.

And even that highlights that when we think about misinformation, it's much broader than what we typically think. We think about misinformation as being complete, outright lies.

Here, that statement was literally what the article said. It wasn't that we were misquoting; we just didn't see the context. The second misstep is: facts are not data, because they may not be representative. We might like to quote a single fact, a single example of somebody who happened to be really successful.

So let's say your teenager says, I don't need to study to go to university; Kim Kardashian became really wealthy without going to university. That is true; it's a 100 percent true statement. But it's an isolated example. We need to look at lots of data to reach a general conclusion.

So we want to look at lots of people who did go to university and how successful they were, and lots of people who didn't go to university and how successful they were.
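As a toy illustration of that point, consider the sketch below. It is my own example with invented income distributions, not data from the book: the degree group does better on average, yet the single most spectacular success almost certainly sits in the no-degree group, and that outlier is exactly what an anecdote like the Kardashian example latches onto.

```python
# Why one striking example is not data: the most visible individual can point
# the opposite way from the averages. All numbers here are made up.
import numpy as np

rng = np.random.default_rng(1)
degree = rng.lognormal(mean=11.0, sigma=0.5, size=10_000)     # hypothetical incomes
no_degree = rng.lognormal(mean=10.6, sigma=0.9, size=10_000)  # lower median, fatter tail

print(f"median income, degree:    {np.median(degree):,.0f}")
print(f"median income, no degree: {np.median(no_degree):,.0f}")
# The heavier tail means the most famous success story likely has no degree:
print("richest individual has no degree:", no_degree.max() > degree.max())
```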

So again, we can't be swayed by just one isolated case; let's look at the general data. The third misstep is: even large-scale data is not evidence, because it may not be conclusive. There could be a correlation, but no causation. If I go back to my breastfeeding example, there was data across thousands and thousands of babies showing that breastfed kids have better outcomes, but that data is not evidence, because it allows multiple explanations.

It doesn't pinpoint one conclusion. What's the difference between data and evidence? Where do we most often find the word evidence used? In a criminal trial. And in a trial, evidence is only evidence if it points to one particular suspect.
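Here is one way to see that distinction in miniature, again a hand-rolled sketch rather than anything from the book: two simulated worlds, one where X genuinely causes Y and one where a hidden factor Z drives both, produce essentially the same correlation. The data pattern alone cannot tell you which world you are in, which is exactly why correlational data is not yet evidence.

```python
# Same correlation, two different causal stories: data alone can't distinguish.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# World 1: X genuinely causes Y (think: a real breastfeeding effect).
x1 = rng.normal(size=n)
y1 = 0.5 * x1 + rng.normal(scale=np.sqrt(0.75), size=n)

# World 2: hidden factor Z (think: family support) drives both; X does nothing.
z = rng.normal(size=n)
x2 = np.sqrt(0.5) * z + np.sqrt(0.5) * rng.normal(size=n)
y2 = np.sqrt(0.5) * z + np.sqrt(0.5) * rng.normal(size=n)

print(f"corr in causal world:     {np.corrcoef(x1, y1)[0, 1]:.3f}")  # ~0.50
print(f"corr in confounded world: {np.corrcoef(x2, y2)[0, 1]:.3f}")  # ~0.50
```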

The final step is: evidence is not proof, because it may not be universal. Even if you've nailed down evidence in one particular setting, it might not apply in other settings. Take the famous 10,000-hour rule, popularized by Malcolm Gladwell: if you spend 10,000 hours working on something and practicing it, you'll become an expert.

He claims that this applies to everything from chess to neurosurgery, but the evidence he quoted was on violin playing. And violin playing is a very different context to neurosurgery. With violin playing, you're playing the same sheet music; yes, you can practice it many times and get better.

Whereas with neurosurgery, each case might be different from another, so what works in one might not work in another.

Andy Earle: And you also point out that the violinists in the study had already gotten into a really prestigious music school. So it's not a fair study where you take everybody, randomly assign people to practice for 10,000 hours or not, and see what outcomes happen.

Alex Edmans: You're absolutely right, Andy. And actually, this 10,000-hour study, even though it has gone so viral, falls at every single misstep of the ladder. We've focused on the final one, over-extrapolating from violinists, but it didn't even show, within the field of violin playing, that practice makes perfect, because it looked at people who were already in the prestigious Berlin Academy of Music. There could be lots of people who practiced for 10,000 hours and never even got into the Academy, but they were not in Malcolm Gladwell's data set.

Andy Earle: But it's going to motivate people to work harder and to put in more effort.

Alex Edmans: Absolutely. So you might think, am I just nitpicking, being some sort of nerdy academic, when actually we just want to get our kids to practice and don't care so much about scientific accuracy? But I think it does matter.

Why? For a couple of reasons. Number one, Gladwell gives the impression that what matters is the quantity of practice rather than the quality of practice. For example, with music, and I was a musician when I was young, some of the most effective practice is recording yourself and watching yourself play.

Now, people hate that, right? It's really embarrassing to listen to yourself sing, or watch yourself speak in public, or see yourself play an instrument. People just don't like it, because the recording will highlight all the flaws you're making, but that is the way to practice.

Often people just go through the motions and repeat the same song many times. That is not what counted as practice in the underlying evidence. Also, some people interpret Gladwell as suggesting that you only need 10,000 hours to be successful; you don't need talent. And again, why has his book been so successful? It's empowering.

We like to tell our kids, you can do anything you put your mind to; you don't need talent, you don't need genetics. And Gladwell does write that you don't just need genetics, you do need to work hard. But some people have interpreted his study as suggesting that, irrespective of ability, if you just want something hard enough, then you'll be successful.

So if I wanted to be a famous pop star, I should just sing and practice for 10,000 hours. But sadly, I don't think I'd be successful no matter how much I tried. And so sometimes we do need to give our kids direction. Yes, it's great to have dreams, it's great to have ambitions, but time is the most precious resource.

And trying to get our kids to focus their time on things they're clearly passionate about, but also things for which they have a natural aptitude and a natural skill, even though that might not be as freeing as "you can do anything," I think that's what's going to lead to the best long-term outcome.

Andy Earle: I think it so perfectly appeals to those twin biases you talk about, because it gives us a nice benchmark: 10,000 hours. You either practice that much or you don't. This is the number; this is what you've got to do to make it. And we want to believe it, because if we're not an expert in something, or we're not at the top of our game, we want to believe that we can be if we just work hard enough.

And if we are, we want to believe that it's because we worked so hard, not because we were just born this way, happened to have these abilities, or got lucky and caught a successful break in our life.

So this 10,000-hour rule is set up in such a perfect way to appeal to those twin biases you talk about.

Alex Edmans: Absolutely, and the flip side of that is, if something goes wrong, you end up blaming yourself. I think this is really important for teenagers as well, because if the idea is that success is in your hands, if you just work hard enough, if you get your 10,000 hours in, then if you end up not making the sports team, or not getting the exam results, or not getting into the university, it was your fault.

You just didn't work hard enough. And again, we do want to encourage effort, but if we give the impression that outcomes are due to effort alone and there's no luck involved, then people can beat themselves up a lot when they don't achieve a particular outcome.

We want a healthy balance: encouraging people to try and to put in the effort, while recognizing it's not as mechanical as the rule would suggest. There's a lot of other stuff going on. So yes, we try, but we don't beat ourselves up too much if we end up not getting what we wanted.

Andy Earle: Yes, indeed. Wow.

There's really so much in this book to get people thinking. There are interesting examples and so much research that you've done here.

I really encourage people to check out a copy. It's called May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases, and What We Can Do About It. Alex, thank you so much for coming on the show today and speaking with us about your work and sharing stories from your life and experiences.

Alex Edmans: I really enjoyed the conversation, Andy.

Andy Earle: Can you talk about where people can go to find out more about you and what you're doing, and maybe follow updates from you?

Alex Edmans: My website is alexedmans.com, that's E-D-M-A-N-S. I'm also active on LinkedIn, under the username A Edmans. There's also a website for the book, maycontainlies.com. That's not just about the book itself; if more misinformation comes out now that the book has been published, I will try to share it on the website.

Andy Earle: Interesting. Yeah, I encourage people to check it out, and I hope that comes about soon.

Alex Edmans: I would not be the person to prepare such a thing, because that might be seen as self-interested. But it would be valuable if there were an independent body, because there are a lot of books, particularly in the self-help genre, which are based on confirmation bias and not really on information, and there's no central repository for a discerning person to know which books to trust and which ones might have concerns.

Andy Earle: Alex, thanks again, really appreciate it. I hope that people will check out the book.

We're here with Alex Edmans talking about how to tell when your biases are leading you to believe something untrue. And we're not done yet. Here's a look at what's coming up in the second half of the show.

Alex Edmans: Is that what you're concerned with? Are there other things you're concerned with that you'd like me to address? And that immediately changed the tone of the meeting. Instead of fighting against each other, it highlighted that we had a common goal.

Now, they didn't have to agree with what I did. But they at least had to understand the thought process and the logic behind what I'd done.

And so let's say, how do I apply this to parenting? I'm not a parent of a teenager; my son is only two and a half years old. But sometimes at night, if I'm trying to put him to bed, he might want his mum.

I'll say, Mummy is tired. Do you want Mummy to get better? And he will say yes. And then, after establishing this, I'll say, then can Daddy put you to bed and Mummy rest?

And so rather than telling him why he had to let Daddy do it, I wanted to start by finding some common ground. And the commonality is that, yes, he would be happy with his mum having a rest.

Andy Earle: Want to hear the full interview? Sign up for a subscription today. It's completely affordable and your membership supports the work we do here at Talking to Teens. You can now sign up directly through Apple Podcasts. Thanks for listening, and we'll see you next time.

Creators and Guests

Andy Earle (Host)
Host of the Talking to Teens Podcast and founder of Write It Great

Alex Edmans (Guest)
Professor of Finance, LBS. Purposeful business, responsible investing, behavioral economics. Author: Grow the Pie; TED Talk: What to Trust in a Post-Truth World