Rob Henderson explains some of the reasons smart people can believe dumb ideas:
Many have discovered an argument hack. They don’t need to argue that something is false. They just need to show that it’s associated with low status. The converse is also true: You don’t need to argue that something is true. You just need to show that it’s associated with high status. And when low status people express the truth, it sometimes becomes high status to lie.
In the 1980s, the psychologists Richard E. Petty and John T. Cacioppo developed the “Elaboration Likelihood Model” to describe how persuasion works. “Elaboration” here means the extent to which a person carefully thinks about the information. When people’s motivation and ability to engage in careful thinking are present, the “elaboration likelihood” is high. This means people are likely to pay attention to the relevant information and draw conclusions based on the merits of the arguments or the message. When elaboration likelihood is high, a person is willing to expend cognitive resources to update their views.
Two paths to persuasion
The idea is that there are two paths, or two “routes”, to persuading others. The first, termed the “central” route, involves careful and thoughtful consideration of the messages we hear. When the central route is engaged, we actively evaluate the information presented and try to discern whether or not it’s true.
When the “peripheral” route is engaged, we pay more attention to cues apart from the actual information or content of the message. For example, we might evaluate someone’s argument based on how attractive they are or where they were educated, without considering the actual merits of their message.
When we accept a message through the peripheral route, we tend to be more passive than when we accept a message through the central route. Unfortunately, the peripheral route is the more prevalent one: we are exposed to far more information than we can carefully evaluate.
The renowned psychologists Susan Fiske and Shelley Taylor have characterized humans as “cognitive misers”. They write, “People are limited in their capacity to process information, so they take shortcuts whenever they can”.
We are lazy creatures who try to expend as little mental energy as possible.
And people are typically less motivated to scrutinize a message if the source is considered to be an expert. We interpret the message through the peripheral route.
This is one reason why media outlets often appoint experts who mirror their political values. These experts lend credibility to the views the outlet espouses. Interestingly, though, expertise appears to influence persuasion only if the individual is identified as an expert before they communicate their message. Research has found that when a person is told the source is an expert after listening to the message, this new information does not increase the person’s likelihood of believing the message.
It works the other way, too. If a person is told that a source is not an expert before the message, the person tends to be more skeptical of the message. If told the source is not an expert after the message, this has no effect on a person’s likelihood of believing the message.
This suggests that knowing a source is an expert reduces our motivation to engage in central processing. We let our guards down.
As motivation and/or ability to process arguments decreases, peripheral cues become more important for persuasion, which might not bode well.
However, beliefs we update by weighing the actual merits of an argument (the central route) tend to endure and are more robust against counter-persuasion than beliefs we update through peripheral processing. If we come to believe something through careful and thoughtful consideration, that belief is more resilient to change.
This means we can be more easily manipulated through the peripheral route. If we are convinced of something via the peripheral route, a manipulator can use that same route again later to alter our initial belief.