Ep 109: Can Your Teen Spot the Truth?
Andy:
Can you talk a little bit about where did this come from? What inspired you to write this book about fake news?
Cindy:
Sure. So I am a former CIA officer and when I left the agency in the middle of 2017, it was a pretty interesting time for our country. It was just a couple of months after the public had started to grapple with the fact that Russia had interfered in our presidential election, and people had questions. Naturally they had questions about what was Russia doing? What was their aim? What were they attempting to do? How much did they influence the election? Does Russia always do this? Are there other countries that do this? How does this all work when a foreign country uses things like false information to influence and manipulate foreign events?
Cindy:
So I had started to do a little bit of writing publicly to explain some of these things. And I just kept getting questions from people who just simply wanted to know how could they sort out the true from the false in their social media feeds and online. And it became clear to me that the skills that I had learned and used as a CIA analyst were really relevant to people just trying to make sense of their newsfeeds. So I started putting together all of those answers that people were looking for into a book and there it is.
Andy:
Well, and it's more than just simple answers about how to look at your newsfeed. This book to me provides so much perspective. I mean, I love that the first part of the book goes totally through the history of fake news, which was fascinating to me, because we think of it as a recent phenomenon, a thing that happens on social media. But no, you point out it goes all the way back to ancient Egypt, the Pharaoh Ramses, and you walk through all these really, really interesting examples throughout history of how fake news has been used to sway popular opinion.
Cindy:
Yeah. The historical piece was really important to me to talk about in the book, because in 2016 and 2017, when most people were sort of learning about this for the first time, there was of course a lot of panic around it. And so I felt like it was important to show that this isn't a new issue. And while technology is certainly a game changer when you're comparing, for example, Ramses the Second and his use of false information to what we see on social media today, so many of the false narratives that people have used throughout history, and the way that they've used false information, have been the same across so many different important events and time periods that we know about today. So for the historical part, I tried to use a lot of historical vignettes or figures that are well-known today. People that most of us have probably heard about or read something about at one time or another.
Andy:
Yep, absolutely. And even going back to early presidential elections and I'm thinking this is not even a new thing in terms of that. Going back to Thomas Jefferson and Benjamin Franklin.
Cindy:
Yeah. I think in the United States, at least sort of, for us, we think of politicians as... Like we're always a little bit suspicious of politicians culturally.
Andy:
They have an agenda or something.
Cindy:
Yeah. They have an agenda like mudslinging is sort of normal. Like we just sort of expect politicians to say untrue things. And it does go all the way back to the founders, who, many of them either invested in or ran or financially supported different newspapers as a way to have favorable articles written about them and unfavorable articles written about their political rivals.
Andy:
Smart.
Cindy:
It's smart. It's terrible but it's smart. I think we've always had this deep seated mistrust in politicians. But I think when we think about modern day political elections, the idea of politicians using false information to win votes is certainly nothing new. It's just, again, that technology piece that is the difference.
Andy:
I love this phrase that you have in here. I don't know how to say it, but there's actually a word that the Nazis used to talk about foreign newspapers or reporters that were saying something that was critical of the Nazis and it translates to lying press. So it's almost like a precursor of the word for the fake news or that's what we're calling it now, but there have actually been other terms for it even in the past.
Cindy:
Yeah. It's really interesting because fake news as a term in the English language has certainly been distorted in recent years, not just by American politicians but by political leaders around the world who have used it to target legitimate press and press freedoms in their countries. And even that is not new. That was sort of the point of talking about the Nazi party and its war against legitimate press and press freedoms in Germany: a way to hide the truth of their actions, to villainize and marginalize communities in their country and other countries as a way to justify horrific, horrific violence against them, and also to discourage their own population, which was not being targeted, from believing anything that the foreign press was saying or reporting about what was happening in their country. So it has been very effectively used as a term by world leaders throughout history to target press freedoms.
Andy:
And just to cast some doubt and to create confusion and to make people uncertain of what's going on is a really powerful tactic. You actually have a quote in here from one of the executives at a cigarette company in 1968 saying, "The most important type of story is that which casts doubt in the cause and effect theory of disease and smoking. Eye grabbing headlines should strongly call out the point: controversy, contradiction, other factors, unknowns."
Cindy:
So this is the chapter that I wrote on what's called big tobacco: the large tobacco corporations throughout the fifties and later who banded together to push back against the scientific evidence showing that smoking leads to cancer and other serious health conditions. They essentially banded together to push out this information campaign to try to cast doubt on the medical findings, the very clear evidence that smoking leads to cancer. And they ran this whole marketing campaign for decades. Their ultimate goal was to keep people smoking, to keep getting more and more people smoking, but it wasn't necessarily to prove the evidence wrong. It was mostly to cast doubt and therefore...
Andy:
Confusion.
Cindy:
Confusion, exactly, because ultimately you don't necessarily need to convince people of your side of things. You just need to make it so that they don't trust the evidence that's coming out, and therefore, in their minds, those people would continue to smoke. So that's a lot of what we see today too, particularly on issues related to health and science: they're not necessarily looking to convince you 100% to change your opinion, but just to get you to not trust the accurate sources of information. Create enough confusion that you feel like you can't trust otherwise very trustworthy organizations.
Andy:
Right, right. To make the consensus seem like less of a consensus and a lot more of like, "Well, there's a lot of different voices and a lot of people think different things."
Cindy:
It's up to your opinion. Yeah. Go with your gut.
Andy:
Yeah, exactly. Some people believe this. Some people believe that.
Cindy:
Exactly. What can you do?
Andy:
Yeah. This is interesting. You talk about a 2018 MIT study about the biggest news stories in English that were shared on Twitter between 2006 and 2017. The researchers looked at 126,000 stories shared by millions of users. And they found that fake news and rumors overwhelmingly reached more people and spread six times faster than true stories. Fake political news, more than any other category of false information on social media, reached more people faster and went deeper into their personal networks.
Andy:
So, wow. If that's what the truth is up against, the fake stuff seems way more viral.
Cindy:
Yeah, and that's for a couple of key reasons that I think are really important to talk about, which is false information really thrives in environments of chaos, of high emotional tension, of political division and that sort of thing, because I think we're all feeling it right now. I think as a society we're feeling very stressed. We've been living in a global pandemic, which is certainly something my generation has never experienced and the younger generations have never experienced. We have serious political division. We've had national unrest over police violence. It's just an incredibly tense time. And that is the perfect scenario for people looking to spread false information because we're just all very emotional. And when we're feeling emotional about content, we're less likely to think critically about it. We're more likely to trust our opinions, our guts, and that sort of emotional reaction.
Cindy:
So if you see some very politically divisive content that confirms your preexisting views of a situation, or you feel very, very strongly about that piece of political content, you're more likely to share it than you are to be like, "Okay, let me click on this link. Let me follow it. Let me see where this article leads. Let me investigate an article. Who's posting it? Where is it coming from? Is this a corporation? Is this a politician? Is it a news website? Can I trust them? Do they have things that I can verify, research that I can do, that sort of thing?" We're just more likely to share it.
Cindy:
I don't know what your experience is like on social media, but I spend a lot of time on Twitter, which I probably need to cut back on. But I always find that my little pithy, I don't know, humorous or just sarcastic tweets, those are the ones that get tons of likes, tons of retweets. It's not the 25-tweet thread that I do on the nuance and historical context of this particular topic, right?
Andy:
Right. Snappy soundbite.
Cindy:
Yeah. When I do those, I always get retweets that are like, "Well, this was very long, but it's worth it, I promise, read to the end." But I get way more engagement from tweets that are sort of like Cindy's hot take of the day rather than Cindy's well thought out, researched, analyzed, thorough, twenty-five-tweet thread. So that's really the issue.
Andy:
Right. We don't necessarily want to go wade through all of it. We want to kind of just trust that you're the expert.
Cindy:
Yeah, it does make me very nervous when people respond like, "Oh, I always trust whatever you say." I'm like, "No, no, please don't."
Andy:
No, that's exactly the point. Don't do that.
Cindy:
So let's talk about that, right? I am not an expert on everything. I try to weigh in on things when it makes sense for me to weigh in, but you should absolutely not trust 100% of everything I say, because I'm a human, I'm not an encyclopedia. I try to be thoughtful, but the vast majority of what's on social media isn't, and therefore independent research is still very important.
Andy:
You have a study in here actually confirming what you were just talking about. They gave people different statements and had them classify each one as a fact or an opinion. And regardless of what political party participants belonged to, they were more likely to say a statement was a fact when it appealed to their political beliefs, and more likely to classify it as an opinion when it was something they disagreed with.
Cindy:
Yeah. I haven't done as much research into how this varies across different countries in the world. But as Americans, we sure are very confident in ourselves. We tend to trust our gut as we've talked about a couple of times now. Because of our own biases, things like cognitive dissonance and that sort of thing, we are always interested in finding information that confirms what we already believe. And we make our beliefs around very little evidence. We tend to make snap judgements. And as a result, all of that plays into how we see what is fact versus what is opinion.
Cindy:
And I think that study is a really great example. It's based on political issues, primarily things like stances on the minimum wage. And when you believe that the minimum wage should be raised, or that it shouldn't be raised, and you say it enough times to yourself and you hear it enough on your favorite news broadcast that this is what needs to happen, your brain naturally starts to look at that as fact, not opinion.
Andy:
Minimum wage is too low. Yeah, that is...
Cindy:
Yeah, exactly.
Andy:
That's incontrovertible.
Cindy:
Exactly.
Andy:
That's interesting. Yeah. So those things kind of shift and the more they get repeated, the more they crystallize and solidify from opinions over to facts in your brain.
Cindy:
Yeah. And the people who push out false information, whether it's for ideological reasons or it's financially motivated, they know that. They know that repetition is a huge piece of this. I not only write about this stuff, but I actually do this kind of investigative work in my day job as well. So I spend a lot of my time hunting down networks of people, groups, companies, governments, that sort of thing, that have created elaborate networks of social media accounts, groups, and pages where they just spam content. They're posting content all day long, the same kinds of narratives, the same false claims, because they know that when it shows up in your social media feeds enough times, you start to think that it's true.
Andy:
See it enough times.
Cindy:
Right? Yeah. It starts to look familiar to your brain and you're like, "Oh, that must be true." So repetition is huge.
Andy:
You point out, actually, yeah, there's the famous meme after the speech Melania Trump gave, and everyone was saying, "Oh, hey, you copied this from Michelle Obama." And it totally went viral, huge thing. And then it turns out it was made up. She didn't actually even say that. But you had a nice point, I thought, that it was really well accepted by people because it's something they had heard before: she'd been accused of that before, from something totally different. So that's maybe in the back of your mind or something. And so when you hear this story that, "Oh yeah, she copied it word for word, look, it's right there on the meme," you don't go and actually try to watch the whole speech and find wherever in the speech she said that and confirm it. You just go, "Yeah, and I've heard something about that before too. She did that before." Retweet, share, "Get a load of this, people."
Cindy:
Yeah, exactly. And so many of the campaigns that people wage to spread false information are also built, besides the emotional piece and the repetition piece, around leading with a kernel of truth. So there was a kernel of truth in this story, in that there was a previous instance in which some language had been very, very similar across speeches. And so the next time it was claimed to have happened, people accepted it, because there had been this kernel of truth in it previously. And with the speed at which information travels and the amount of content that we're seeing every day, our brains just aren't very good at this. We just don't take the time, frankly, to sort these things out.
Andy:
We don't have time.
Cindy:
We don't have time.
Andy:
Yeah, how long would it take to go watch their speeches? And as you pointed out, actually, when the meme was first posted, Melania's speech wasn't even available. If you wanted to go and fact check this and try to find where she said this quote and copied Michelle Obama's speech, you wouldn't even be able to, because the White House hadn't made the tape of her speech available yet. So good luck trying to verify. Yeah.
Cindy:
Yeah, exactly. And that's another point that I raise throughout the book; that's a good example. I also talk about it in a chapter that I have on Marie Antoinette, in which what we call an information vacuum plays a key role in false information spreading. When there is an absence of information out there, in swoops the purveyor of false information to say, "Well, here's what's happening," or "Here's the information." And because there's no other information to compare it to, you go with the false story. So with Marie Antoinette and the Royal Court in France, in the lead-up to the French Revolution, it wasn't considered any of the citizenry's business to hear about what was going on in the Royal Court. So they didn't share anything, or they shared very little.
Andy:
Right. Closed doors, not your business.
Cindy:
Exactly. "We are anointed by God, be on your way" was the belief. And so there was a lot of room for individuals looking to spread false stories about Marie Antoinette and the rest of the court to make things up, because what else did people have to believe or to consult, right?
Andy:
And so, as you point out, I thought this was so interesting: making these pamphlets where they would take basically one giant sheet of paper and print it so that it could then be folded up into 16 pages or something like that, and be easily duplicated. You make thousands of them, fold them up, and just start passing them out. And even though a lot of people at the time couldn't even read, with these things all around out there, I guess there were enough people who could read that it made you curious what they said, and you asked somebody to help you figure it out.
Cindy:
Yeah. I mean, literacy rates were certainly growing at the time, but a lot still traveled via word of mouth as well, which sort of creates another opportunity for distortion. I always talk about the telephone game. And I think I lead a chapter with the telephone game. It's something I played as a kid. You start with a particular message and absolutely every single time, by the time you get to the end of the line, the last person has heard something very different from the first person. And that's certainly...
Andy:
Not even in the same ballpark.
Cindy:
Usually not at all. And so certainly back in older generations, when things were traveling via word of mouth, this was a huge risk. But social media also acts like this in a lot of ways too because another thing I talk about is that the vast majority of Americans don't click on articles, they just read the headlines and share.
Andy:
It's 56% or something that... There was an actual study on that.
Cindy:
There was a study on it. It's horrifying.
Andy:
I can't believe that. Yeah. So more than half of the stuff you see in your newsfeed, whoever posted that, they didn't even read it. They just said, "Oh yeah, it looks good."
Cindy:
Yeah. And a headline is a very, very short summary of the key thing that the article is trying to convey, but usually an article of course, has much more nuance. And so the vast majority of Americans just read the headline and then share, and then you add to it, everybody's got to have a hot take, right? So I read the headline, let's say, I have a hot take on this issue just strictly based off of the headline. Then somebody retweets me and they have a hot take on my hot take, which wasn't even based on the article because I didn't read the article. Hypothetically again, of course I read the articles.
Andy:
Oh, yeah, right.
Cindy:
Of course. But that's how very, very garbled messages on social media take off and go viral.
Andy:
Yeah, right. And they get snowballed and just blow out of proportion so quickly.