Ep 289: The Art of Detecting Teen B.S.
Andy Earle: You're listening to Talking to Teens, where we speak with leading experts from a variety of disciplines about the art and science of parenting teenagers. I'm your host, Andy Earle.
We're here today with John Petrocelli talking about bullshit. How do you know when your teenager is bullshitting you?
And how can you make sure you raise a teenager who's not susceptible to believing bullshit that they hear from other people?
We live in a world where bullshit seems to be on the rise. People can say anything they want and often it goes without being questioned.
Research shows that as parents, we are, in many ways, susceptible to being bullshitted by our teenagers.
If you don't want to let your teen get away with bullshit, and if you want to raise a critical thinker who can evaluate claims effectively, then you need to understand the science of bullshit.
John Petrocelli is an author and researcher, and he wrote the book on the topic.
John, welcome to the Talking to Teens podcast. Thanks so much for coming on the show today.
John Petrocelli: Thank you, Andy. Thanks for having me.
Andy Earle: Wow. You have a very unique area of research in the field of bullshit. You've written this book, The Life Changing Science of Detecting Bullshit. Super interesting. Lots of stuff to talk about in this book.
You get into the science of bullshit. And what it is, where it comes from, why we do it, why we believe it. How'd you get into being the bullshit guy?
John Petrocelli: Yeah I've been conducting empirical research on the behavior of bullshitting for about the last 10 years.
And we accumulated enough empirical evidence to put together some general guidelines, which serve as what we call antecedents, or the conditions under which people engage in the behavior. We're ultimately really interested in the consequences of the behavior.
And it appears there may be some immediate, maybe temporary, benefits for the individual bullshitter. But the more long term consequences for society are much greater. So it might help a bullshitter to sound knowledgeable, maybe persuade others into believing something is true.
And it may or may not be true. And when you believe something that isn't true, that can lead to a lot of bad decisions.
What we find is bullshitting is often confused for lying. You can't just tell participants in experimental studies, see if you can find the bullshit here. What they'll often look for are lies. So people confuse bullshit with lies but these are two very different things. It's true that both the bullshitter and liar are trying to portray themselves as actually caring about the truth.
But only the liar actually cares about the truth. They're trying to distract us from the truth in some way, or some aspect of the truth. But the bullshitter doesn't care what the truth is, they're not interested in it. And every once in a while, the bullshitter is actually correct just by chance, by accident, but the bullshitter wouldn't even know that they're correct.
We find this really big difference. And once we introduce people to what we mean by bullshit, they definitely know the difference. And there's certainly quite a bit of evidence that we've accumulated now that people treat bullshitters and liars very differently.
So liars usually get marks of disdain. We don't trust them. Furthermore, maybe they'll need to tell us 300 truths for us to view them as honest again. With bullshit, we usually give bullshitters a social pass of acceptance. We just say, oh Andy's just bullshitting. And we just leave it at that and we assume it doesn't have any consequences.
And this is really where we could not be more wrong. Because all of our research so far suggests that, if anything, bullshit can have a greater effect on memory, learning, and beliefs about what we take to be true. And what we believe to be true is foundational to optimal decision making.
Once people label something a lie, they tag it as false and it doesn't have an effect. And when they see it again, they know that it's not true. With bullshit, maybe it is true. Maybe they're actually on to something. You have to do the work. So even if someone is bullshitting us, Andy, and I tell you, oh, Bob, he's just bullshitting us, it could still be true.
He's not lying. And so our reactions to the bullshitter and the liar are different. And, unfortunately, the impact of bullshit can be longer lasting, because if we don't do the work, we don't do the fact checking, we don't do the due diligence of investigating whether or not it's true, oftentimes it's false, and then it's going to have an impact on all sorts of things. And I happen to believe that pretty much all of our problems, whether they're personal, interpersonal, societal, or financial, are in one way or another connected, directly or indirectly, to mindlessness and what I call bullshit-based reasoning.
And this is reasoning based on what we might hope to be true, what we wish to be true, what we would like to be true, but we're not really sure it's true. We don't really care whether or not it's true. And again, this has a pretty deleterious effect on judgment and decision making relative to lies, which we can label as false.
Andy Earle: It sounds like the main difference between lying and bullshitting is you don't actually really know what you're talking about when you're bullshitting. You don't really know what the truth is. You're just riffing or talking out of your ass a little bit.
John Petrocelli: Yes. And it involves a broad array of rhetorical strategies to help us sound like we know what we're talking about when we're trying to impress, influence, or persuade others. We gave participants in our research studies things that don't even exist, like a fictitious disease or a fictitious animal. And they were more than ready to tell us how to avoid the disease, how to treat it, how to detect it, or how to feed that pet, how to care for it, what it probably eats, and things like that. None of it could possibly be true, because the objects we gave them were fictitious.
What was remarkable to us in our early research studies: even when we told people, no, you do not have to provide an opinion, they were readily willing to provide some thoughts on things they had no obligation to address. The only time people were not so willing to engage in bullshit is when they expected the audience to evaluate them in some way, and the audience had more knowledge than they possessed.
So if you expect to speak with an expert, like in my case, I would never bullshit an auto mechanic because I don't know enough about cars and they're going to know right away that I don't know what I'm talking about. So in that case, if I'm not obligated to know anything and I would otherwise be caught in my bullshit, that's really the only condition that people seem to refrain from it.
Any other situation, gloves are off, people are willing to provide opinions about things they really know nothing about.
Andy Earle: So is our default to bullshit rather than to say, I don't know about that or I'm not sure about that?
John Petrocelli: I don't know if it's the default. It appears to be. I don't have any... I'm not going to bullshit you.
We don't have evidence yet. We are actually collecting data on some personality traits and characteristics that are associated with bullshitting. And it appears likely that people who are a little more extroverted, or what we call high self-monitors, people who are able to fit any situation: when the situation needs a leader they become a leader, when it needs a follower they follow, when it needs comic relief they're the comedian. So these high self-monitors and extroverts appear to be more willing to provide whatever they think the situation...
Andy Earle: Say whatever needs to be said in this situation to get it out of whatever it's stuck in, or get it to the next phase, or
John Petrocelli: Yes.
Andy Earle: Alleviate tension or whatever.
John Petrocelli: Yes, and bullshit is also helpful even just to kill awkward silence. You'll even hear people say we're just shooting the shit or, we're just bullshitting.
It's just a way to fill the air. There are at least three dozen motivations for it. We haven't mapped them all out empirically yet, but I think there have got to be at least three dozen reasons to bullshit. Sometimes people just want to know what it feels like to say something, and they're also interested in what people's reactions to it are.
So that's another motivation for bullshit. Or sometimes it just feels better to say things that you hope or wish to be true and you don't have any concern for the truth.
And the other thing I forgot to mention about bullshit is that oftentimes people believe their own bullshit. The liar doesn't actually believe what they're trying to lead us to believe.
And so it's probably a lot easier to get away with convincing bullshit. Because if we had a Lying 101 course, we would teach you: try to believe the lie, right? You'll have all the right nonverbals and you'll say the right things if you actually pretend that it's true.
But with bullshit, you don't have that burden, because you don't care about the truth. So think of how easy it would be to lie if you didn't have the burden of knowing the truth. It would be easy. And so bullshitting is easy, and by definition, there's more of it in our social atmosphere than there is outright, intentional deception, which is lying.
Andy Earle: So do you think that as parents, we should be concerned about being able to tell when our teen is bullshitting us or being tuned into that? Or is it benign, and it's not really malicious, and we shouldn't really be worried about it?
Or where should we fall on that?
John Petrocelli: Every parent who actually listens to their teens will be able to better detect when they're lying or when they're bullshitting.
But I think simply just asking: is that true? Let me think about what they just said. What exactly is the claim being made? Do they know that it's true? Why might they be telling me this? What might be the agenda behind the statement?
But the best question to ask, hands down, and it's very unlikely to be asked spontaneously, you have to consciously think about it, is: why might the claim be wrong? There's a strong tendency for people to believe, at least initially.
Someone provides you with a statement. Initially, there's a belief in the claim, simply by virtue of comprehending it. And then you have to do the work, you have to go back, and you have to correct it if it's not true. And there's a whole line of research on this, which we call truth default theory, where the default is to believe that anything you process mentally is true, at least initially.
So if you don't have the mental resources to listen carefully, and then follow up with questions, you can really be thrown for a loop in what you end up believing. Because we also know from volumes of reported research that people tend to believe things that they already want to believe. We call this the confirmation bias.
I want my teenager to make good decisions. I want them to have functional judgment. So I might be looking for things that are functional, right? And if I go into it with this bias, only looking for evidence that confirms my theories, I have very little chance of detecting their bullshit and their lies.
But I actually think, Andy, it would be much more useful if parents would help their children learn how to better detect BS. And teach them the types of questions that they should be asking. And pretty much any domain has its own sort of general questions.
If you're looking for a new car, or maybe a used car, there's a whole area of idiosyncratic bullshit that people are likely to be exposed to. Teenagers would be better off if they became familiar with those basic questions. And that's part of the reason why I wrote the book.
There was an 11th grade AP English Language and Composition course that, I was made aware, was actually using the book, because a major component of the course is critical thinking.
And I contacted the teacher and said, how did you get the book approved? And she said the AP board no longer permits banning books. 11th and 12th grade high schoolers taking AP courses are supposed to be reading the very same things they would read in a college course. And so I've gotten a lot of great feedback and emails already. There seems to be a little bit of fanfare among students and teachers who enjoy the book.
Because really, Andy, the book is a critical thinking skills book at the end of the day. But nobody wants to read a book called Critical Thinking Skills 101, right? But a book on bullshit sounds a little more interesting. But it's really designed to help people become aware that they're susceptible.
Because that's probably one of the biggest problems with bullshit: we all think we're really good at detecting it. But there's not a single study that supports that conclusion.
Most people are poor at detecting BS. And actually, the people who are most confident in their ability are the most susceptible to it. So that's one of the things I try to preach: I'm just as susceptible as anyone else.
But if I have the right tools in questioning and stopping and saying, let me think about what I just heard. What sort of evidence is there for this idea? Is there any readily available evidence for it? Why might someone be telling me this?
Why might the claim be wrong? These kinds of basic questions are wonderful tools to better detect and better dispose of this unwanted social substance that plagues our judgments and decisions.
Andy Earle: There are also times when we might be more likely to accept bullshit without question because of the situation. When we're feeling stressed out or emotional or overwhelmed or something like that, we don't put in the extra effort to really think critically about the claims being made or ask those questions you're talking about.
Are there any tips or strategies on how to catch that or avoid that trap?
John Petrocelli: Yeah, I would say don't make any important decisions when you're fatigued, when you're still processing important information, you're deliberating.
It's not a good idea to form judgments and decisions when you're mentally fatigued. Take some notes now and think about it another time.
Also, one thing that's probably a major source of BS is, frankly, so-called experts. People tend to trust experts. Do a little bit of background on what this person's expertise really is. Are they speaking within their area of expertise? Or are they reaching beyond it?
Are there any reasons why they wouldn't tell me the truth? Is there any reason why the used car dealer might be telling me, oh yes, this car will make it another 100,000 miles? Are there any ulterior motives there?
When you're forming judgments about something of importance, take a step back and look at the readily available information. If there's not enough, then search for it. False information has a real negative effect on our basic judgments and decisions in life.
Andy Earle: Tied to that idea of expertise, that we tend to just believe people if they're in a position of expertise, it makes me think about just how many people we believe because they have lots of followers.
They must know what they're talking about. If they have a million followers on social media and they're a psychologist or a doctor and whatever, all these people are listening to what they say and liking their posts. Gotta be true.
John Petrocelli: One of my favorite examples of what you're talking about happened a couple of years ago with YouTube star Logan Paul.
He completely bullshitted his way into a multi-million dollar fight with an undefeated superstar boxer, Floyd Mayweather, right? And what's interesting is Logan Paul had no chance, right? But he had 50 pounds on Floyd Mayweather, and he's a half foot taller. And he's much younger; Mayweather was 44 at the time.
I think Logan Paul landed a few punches, but right after the match, which he clearly lost, he said, I feel like I'm the winner. Because I went the distance.
So Floyd Mayweather is a sort of peekaboo-style, defensive-style boxer. And Logan Paul went eight rounds with him. Most of Mayweather's fights were 12 rounds. But from Logan Paul's perspective, he's still the winner.
But this is a case that I think shows where bullshit can be quite useful, quite profitable, for the individual. But it wouldn't be a good idea, I think, to follow in Logan Paul's footsteps and try to fight Floyd Mayweather or Mike Tyson or any of these professional boxers who are willing to do these kinds of promotional fights right now.
The things that we see through social media and mainstream media, they're really worth questioning.
But I think the most important thing is just to be aware that we're all susceptible to this sort of general overconfidence, that not only do we know what we're talking about, but that we can readily detect BS. And it's a little more complicated. It's a little more mentally involving than just saying, oh, yeah, I know it when I see it. It really involves an attitude of skepticism and the right kinds of critical thinking questions. Who's telling me this? Why? What's their expertise? What agenda do they have? How do they know it's true?
Why might it be wrong?
Andy Earle: John we are running out of time here, sir. Thank you so much for coming on the show and speaking with us about the life changing science of detecting bullshit. Fascinating work that you do. Super interesting book. There's so much in here that we did not get into.
So many interesting examples, ideas, concepts, studies, stories. So I would highly encourage people to pick up a copy.
John Petrocelli: Oh, yeah. Thank you, Andy. Thanks for having me.
Andy Earle: Where can people go to find out more about your work or to follow updates from you or learn about what you're doing?
John Petrocelli: The best place probably is just my website.
It's johnvpetrocelli.com. And I'm at Wake Forest University, Department of Psychology. I'm easy to find. I have everything posted there, all of our latest findings. That's it.
Andy Earle: Excellent. Thanks again for coming on the show and really hope people find it interesting.
John Petrocelli: All right. Thank you.
Andy Earle: We're here today with John Petrocelli talking about how to know when your teenager is bullshitting you. And we're not done yet. Here's a look at what's coming up in the second half of the show.
John Petrocelli: And actually one of the biggest bullshitters that we're exposed to is bullshit coming from the self. And believing your own bullshit is not functional.
One of the things I tried to do with the book is to show how fun it can be to detect bullshit, especially from so-called experts.
It could be something on television, or maybe even something a teacher said, and maybe the claim is generally correct, but maybe there are conditions under which it's not correct. And if anything, the opposite is true.
Andy Earle: And that's another thing you talk about towards the end of the book is when we notice bullshit and we're sure that's bullshit, how do we call people on it?
Or what does that process look like?
John Petrocelli: What bullshitters will often do is they'll take a few steps backwards and they'll be like, Okay what I really mean is... and so they're already cleaning it up, so you're already exposing yourself to less bullshit, right?
Andy Earle: Want to hear the full interview? Sign up for a subscription today. It's completely affordable and your membership supports the work we do here at Talking to Teens.
You can now sign up directly through Apple podcasts. Thanks for listening, and we'll see you next time.