A new study published in the British Journal of Social Psychology examined whether frequently trying to impress or persuade others predicts receptivity to various types of misleading information.
“On a basic level, it’s investigating some of the ways that misinformation is spread (intentionally and unintentionally) and evaluated by people when they encounter it,” study author Shane Littrell told us. “I really wanted to scientifically test the old folk notion that, ‘you can’t bullshit a bullshitter,’ because so many people assume that it’s true, simply because they were told that growing up.”
It was an exploratory study, so the goal was simply to identify some of the psychological factors more common in people who intentionally mislead others, the factors more common in people who are likely to fall for it, and, hopefully, any similarities between the two.
“In other words, ‘what makes bullshitters tick?’ as well as ‘why do people fall for BS?,’” Littrell told us. “Mainly, though, I wanted to see if an intentional sender of misinformation (bullshitter) could also be unknowingly/unintentionally duped by the same types of misinformation that they often spread.”
The existing research suggested that the results of the current study could have gone either way. Prior findings on lying are mixed: some studies appear to show that prolific liars are slightly better at detecting lies than the average person, while other research suggests that prolific liars are just as bad at detecting lies as the rest of us.
“So, that suggested that bigger bullshitters might be at least a little better at detecting BS than the average person,” Littrell told us. “But BSing falls just short of lying (as Harry Frankfurt said), and in previous research we found some psychological and cognitive differences between BSers and liars, so we had reason to believe that what’s true for liars might not be true for BSers.”
Past research by Gordon Pennycook on ‘bullshit receptivity’ has shown that people who are more easily duped by bullshit score lower on measures of cognitive ability and analytic thinking skills. Research that Littrell and his team published last year suggests that bigger ‘persuasive bullshitters’ (i.e., those who score higher in persuasive bullshitting frequency) also score lower in cognitive ability and analytic thinking, which hints that bigger BSers might also be worse at detecting BS. Overall, the researchers tested these two competing hypotheses against each other to see which better fit the data.
“Recent events have highlighted the urgency of gaining a better understanding of the spread of misleading information,” Littrell told us. “Our hope is that this research helps answer important questions related to this growing public conversation. If we’re better able to understand why people spread it and why people believe it, we can hopefully devise some ways of preventing people from spreading it and protecting people against falling for it.”
Researchers looked at the relationship between participants’ self-reported frequency of engaging in two types of BSing (persuasive and evasive), which they call ‘bullshitting frequency,’ and their ratings of how profound, truthful, or accurate they found pseudo-profound and pseudo-scientific statements and fake news headlines (their ‘bullshit receptivity’).
‘Persuasive bullshitting’ describes when a person exaggerates/embellishes or otherwise stretches the truth about their knowledge, skills, ideas, competence, etc. in order to impress, persuade, or fit in with others. An example might be a person trying to talk intelligently (or in a way that merely sounds intelligent) about topics they know little about so they can hopefully appear smarter or more knowledgeable than they actually are.
‘Evasive bullshitting’ describes using evasive, non-relevant truths in situations where frankness might result in hurt feelings or reputational harm. In other words, the person answers without really answering. Often, the person is trying to protect someone else’s feelings or protect their own reputation (or the reputation of a group they represent).
“The way we measured bullshitting frequency was by presenting participants with descriptions of various social situations in which they might be tempted to start bullshitting and ask them to rate how frequently they engage in bullshitting in their daily lives when they encounter these types of situations,” Littrell told us. “For persuasive BSing, we have them rate how frequently they embellish, exaggerate, or otherwise stretch the truth in situations such as, ‘When I want to contribute to a conversation or discussion even though I'm not well-informed on the topic.’”
For evasive BSing, participants rated items such as, ‘When a direct answer would hurt another person's feelings.’ To measure bullshit receptivity, the researchers presented participants with a number of statements, some of which were BS and some of which were real.
“The BS items were all randomly constructed by a computer algorithm, which takes a bunch of New Age or scientific-sounding buzzwords (‘quantum interconnectedness’ or ‘hyper-magnetic refractalization’) and randomly assembles them into statements that are syntactically sound (i.e., nouns, verbs, etc. are all in proper order) but semantically meaningless,” Littrell told us. “One example is: ‘Hidden meaning transforms unparalleled abstract beauty.’ They look like they could be real statements, but if you take the time to think about them, you realize they’re actually meaningless nonsense.”
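The generator Littrell describes can be pictured with a short sketch. The word lists and assembly rule below are illustrative assumptions, not the study's actual algorithm; the point is only that grammatical slot-filling with buzzwords yields syntactically sound but semantically empty sentences.

```python
import random

# Illustrative buzzword pools (assumed, not taken from the study's materials).
ADJECTIVES = ["hidden", "unparalleled", "quantum", "infinite", "abstract"]
NOUNS = ["consciousness", "interconnectedness", "potentiality", "awareness", "energy"]
VERBS = ["transforms", "transcends", "nurtures", "unfolds into"]

def pseudo_profound_statement(rng=random):
    """Assemble a grammatically valid but meaningless sentence,
    in the spirit of 'Hidden meaning transforms unparalleled abstract beauty.'"""
    subject = f"{rng.choice(ADJECTIVES).capitalize()} {rng.choice(NOUNS)}"
    obj = f"{rng.choice(ADJECTIVES)} {rng.choice(NOUNS)}"
    return f"{subject} {rng.choice(VERBS)} {obj}."
```

Every output follows a noun-phrase/verb/noun-phrase template, so it parses as English while saying nothing.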
Researchers also gave people actual, intentionally profound quotes (e.g., ‘No man ever steps in the same river twice, for it is not the same river and he is not the same man,’ which is a quote from Heraclitus). Participants rated all the quotes and this allowed researchers to calculate a score of how well they were able to distinguish BS from non-BS. For the fake news items, researchers gave them five fake news and five real news items in the same format that they would see them on social media. Participants also completed measures of cognitive ability, metacognitive insight, intellectual overconfidence, and reflective thinking.
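The article doesn't spell out how the discrimination score is computed; a minimal sketch, assuming a simple difference-of-means scheme (the function name and scoring rule are hypothetical, not necessarily the study's exact method), might look like this:

```python
def bs_discrimination(genuine_ratings, bs_ratings):
    """Score how well a participant separates genuine quotes from BS.

    Assumed scheme: mean profundity rating given to genuine quotes
    minus mean rating given to computer-generated BS statements.
    Higher values indicate better discrimination; zero means the
    participant rated both kinds of statements equally profound.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(genuine_ratings) - mean(bs_ratings)
```

For example, a participant who rates genuine quotes 4, 5, and 3 but BS items 2, 1, and 3 gets a score of 2.0, while rating everything identically gives 0.0.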
“Overall, we found that people who score higher in persuasive bullshitting frequency (that is, they are more likely to engage in bullshitting to impress or persuade others) were more likely to be duped by BS statements (i.e., believe that they are profound or truthful) and fake news (i.e., believe that fake news headlines are accurate),” Littrell told us. “And this finding held even after we controlled for their intelligence, overconfidence, and analytic thinking skills."
When researchers tested this further, they found that bigger persuasive BSers seem to mistakenly interpret superficial profoundness, truthfulness, or accuracy cues as signals of inherent or intentional profoundness, truthfulness, or accuracy. In other words, if something simply sounds profound, truthful, or accurate to a persuasive bullshitter, to them that means that it actually is profound, truthful, or accurate. Basically, they have a hard time distinguishing fact from fiction.
Littrell was surprised by the results of the study and explained that, by definition, bullshitting is an intentional act; a bullshitter knows that they’re bullshitting when they do it.
“So, I guess I naively thought that this might give BSers some sort of an advantage when evaluating BS, because I thought they would kind of know all the tricks, so they might be better at spotting the signs, or at least I assumed that they would,” Littrell told us. “So, I was not only a little surprised at what we found (that big persuasive BSers were more likely to fall for BS), but I was also surprised at how robust the findings were, in that the association remained even after we controlled for a bunch of cognitive factors that we thought would be responsible for such a finding (e.g., intelligence, overconfidence, analytic thinking).”
Littrell explained that it might not matter how smart or analytically minded a person is: if they’re a big persuasive BSer, they’re more at risk of falling for BS. He also believes the results provide a better understanding of how misinformation is spread and interpreted.
“Past research already tells us that misinformation is spread intentionally and unintentionally, but these are the first results showing that being an intentional spreader of misinformation does not inoculate that person against falling for it themselves,” Littrell told us. “It also helps us understand why this happens, as we now know that people who both spread and fall for BS experience metacognitive deficits that prevent them from being able to tell the difference between actual fact and impressive-sounding fiction.”
Patricia Tomasi is a mom, maternal mental health advocate, journalist, and speaker. She writes regularly for the Huffington Post Canada, focusing primarily on maternal mental health after suffering from severe postpartum anxiety twice. You can find her Huffington Post biography here. Patricia is also a Patient Expert Advisor for the North American-based, Maternal Mental Health Research Collective and is the founder of the online peer support group - Facebook Postpartum Depression & Anxiety Support Group - with over 1500 members worldwide. Blog: www.patriciatomasiblog.wordpress.com