Believing is Seeing

Persuasion

“He who permits himself to tell a lie once,” wrote Thomas Jefferson (in a letter to his nephew, Peter Carr, from Paris, France, 1785), “finds it much easier to do it a second and third time, till at length it becomes habitual; he tells lies without attending to it, and truths without the world’s believing him. This falsehood of tongue leads to that of the heart, and in time depraves all its good dispositions.”

Anyone following the ups and downs of federal and provincial politics would have no trouble believing Jefferson’s words. We often believe – without any evidence to support our belief – that politicians lie, simply because other people say so. But Jefferson’s words are a truth that ranges far beyond politics: into faith, into social interaction, into relationships and work. It’s all about persuasion, likability and “social proof” or consensus:

…human beings often make choices about what to think, and what to do, based on the thoughts and actions of others. Simply stated: We like to follow the crowd.

People also tend to say yes and agree with people they like (and who share areas of similarity).

Robert Levine, writing in The Power of Persuasion, says this shows Jefferson “understood the act of taking a stance galvanizes the belief behind the stance.”

In other words, although the speaker knows it isn’t true, by saying it often and loudly enough he will come to believe it himself, regardless of the truth. Cialdini’s rules of “social proof” and likability come into play here, too, along with the notion of self-justification:

Cialdini says that we’re more likely to be influenced by people we like. Likability comes in many forms – people might be similar or familiar to us, they might give us compliments, or we may simply trust them.

If we feel the speaker is “one of us,” a likable peer who shares some similarity with us, we are more likely to repeat the content. We will then commit to that belief by repeating it often enough ourselves.*

An example is the alleged “chemtrail” conspiracy. The more people repeat their belief in the alleged “cover-up,” the more they believe it, and the further from truth and evidence they travel. If they appear to outsiders as likable, believable peers, they gain more credibility for their conspiracy theories, no matter how illogical. Others will share and repeat them. The group effect takes over.**

It’s like canned laughter, as Cialdini points out. We all know it’s fake; we all know we’re being emotionally and psychologically exploited by the TV producers into performing the way we’re expected to perform (laughing at the jokes). But we do it anyway because we have a herd mentality that is hard to overcome. We laugh because we hear others laughing.

Herd mentality

We also use tactics to convince ourselves by acting out certain roles. Levine adds that, “Public displays are especially self-persuasive.” Standing up in front of others and telling the lie makes it easier to believe in it yourself. This is true whether it’s a blog about a local issue or a video recorded before a suicide bombing. As Levine points out: the ritualized act of public display is what matters.

We convince ourselves of the rightness of our words or action by the act of the display. Acting the part makes us believe in the character we are playing (and firm belief in ourselves makes us more likely to be seen as credible or an authority figure).

Eventually, you have repeated it often enough that you can no longer distinguish between lie and truth. In fact, the lie has become your truth, even though to outsiders it remains a lie. Orwell’s definition of ‘blackwhite’ in the Newspeak of 1984 was prescient of today’s psychology:

…this word has two mutually contradictory meanings. Applied to an opponent, it means the habit of impudently claiming that black is white, in contradiction of the plain facts. Applied to a Party member, it means a loyal willingness to say that black is white when Party discipline demands this. But it means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary. This demands a continuous alteration of the past, made possible by the system of thought which really embraces all the rest, and which is known in Newspeak as doublethink.

The true believer, having taken the first step – Jefferson’s lie – then convinces himself or herself of the rightness of that step, outside of any external force or input. But then others repeat that lie, assuming it for themselves because – coming from a likable peer – it reinforces their pre-existing beliefs, so the lie becomes a general “truth” for a wider body.

The more people repeat it, the more a lie becomes their collective “truth.”***

“I’ll believe it when I see it,” becomes “I’ll see it when I believe it,” as Levine points out. Once we’re convinced to believe, we clearly see the conspiracy that was never there before. And the more we act it out, the more we believe.

Conspiracy theories

This can also play into another Cialdini rule: scarcity. We all want more of what we can have less of. If the beliefs are presented as the rare truth only a few can have – even if demonstrably false – then we believe more fervently because we feel we’re getting more of a scarce product unavailable to the masses.

“Whatever is rare, uncommon or dwindling in availability – this idea of scarcity – confers value on objects, or even relationships,” says Cialdini.

Little by little, we commit to the lie, to the belief, to the conspiracy theory. Levine concludes:

We’re compelled to justify our commitments. If there’s no justification in sight—that invisible umpire, again—you’ll look to your own motives for an explanation. There lies the biggest problem of all: once the process begins, it becomes self-perpetuating. If I did it, I must believe it. And if I believe it, I’m more likely to do it again, and more so.

Robert Cialdini, in his book, Influence: The Psychology of Persuasion, writes in his chapter on “Social Proof”:

The greater the number of people who find any idea correct, the more the idea will be correct… Convince and ye shall be convinced… Without question, when people are uncertain, they are more likely to use others’ actions to decide how they themselves should act… we are more inclined to follow the lead of a similar individual than a dissimilar one.

Cialdini later underscores that point. We’re motivated by the herd mentality:

Simply get some members moving in the desired direction and others – responding not so much to the lead animal as to those immediately surrounding them – will peacefully and mechanically go along.

In other words, we flock like geese, flying in the same direction as our peers, or stampede like buffalo, without actually thinking of the direction we’re taking. We do it because those we recognize as our social peers, our influencers, are also doing it.

Being proven wrong – telling geese they’re flying east instead of south, for example – doesn’t make people change their minds, because everyone’s on the same flight path. The flight path becomes the truth and the direction is ignored.

Both men document many instances of cults where predictions about oncoming apocalypses or alien arrivals have proven false. But the believers, rather than being disillusioned, stay on, finding another way to justify their commitment to the cause, altering their beliefs to rationalize the failure rather than giving them up.

In fact, as Levine points out, being proved wrong usually strengthens a belief rather than altering it:

Another consequence of cognitive dissonance is that a belief may actually get stronger when it’s proven wrong. The more you stand to lose, and the more foolish you look, the greater the dissonance and, so, the greater the pressure to prove you were right in the first place. In other words, if you’re in the business of mind control, sometimes nothing succeeds like failure.
Say, for example, you’ve been a dedicated member of a group and are now confronted with evidence that your group’s cause is just plain wrong. Would you admit that you made a mistake and leave? If you’d already committed enough, probably not.

So pointing out an error in belief will not change the true believers’ minds.

It doesn’t matter that the statement, “Despite stating Town residents would have final approval on plans to spend the funds from the sale of 50% of Collus to Powerstream, Council has no plans to obtain any approval from the residents,” can be proven wrong. Nor will it help to challenge such erroneous statements as, “Current Council does not believe it has to get competitive pricing for multi-million dollar projects” with evidence to the contrary.

Facts that counter preconceived ideas will not make the true believers see the light: they will only make them dig in their heels to deny the truth even more. They continue flying east instead of south.

Facts just make some people respond with more vehemence. There’s an old adage:

When the law is on your side, argue the law. When the facts are on your side, argue the facts. When neither the facts nor the law are on your side, make an ad hominem attack.****

Cialdini’s principle of consistency makes it difficult for people to change their message even when facts contradict them. They’re committed to flying east with the other geese.

“Consistency is a principle that asserts that people want to be and to be seen as consistent with their existing commitments,” says Cialdini, the Regents’ Professor of Psychology and Marketing at Arizona State University and Distinguished Professor of Marketing in the W. P. Carey School. “Those commitments can be things they’ve either said or done in the past, especially in public, that give them a position or a stand on some issue.

“I think there are two factors behind consistency… One is the desire to be consistent with what we’ve already done. If you see yourself doing something, it’s only in keeping with what you’ve already done, to do something that is likewise congruent with those actions. We like to be consistent. The second thing related to this is, when you see yourself doing even a small act in favor of a particular cause or issue, you come to see yourself as somebody who actually does favor this idea.”

And, as he notes in his book, “The older we get, the more we value consistency.” It’s not the rightness or wrongness of the belief that matters, but rather that it is consistently maintained. Inconsistency creates cognitive dissonance, causing stress. This forces people to find some rationalization or self-justification to relieve the tension (as Levine points out, usually by ignoring the contrary facts that caused the dissonance).

Cialdini says the best defence against the herd response is to “recognize when the data are in error.” That’s more easily said than done, in part because our initial reaction to contrary data is defensiveness or rationalization. We don’t go against the herd.

Conformity

Both Levine and Cialdini describe how a person who has committed to a belief finds it difficult to accept even a single change, because it would mean the entire belief system collapses. They’ve committed themselves for so long to so many wrong ideas that to admit even one is wrong brings down the whole house of cards.

“The more one endures,” writes Levine, “the greater the need to self-justify.”

Cialdini later refers to “pluralistic ignorance” and warns, “The consequences of single-minded reliance on social evidence can be frightening.” Writing about situations where the “social evidence” has been “purposely falsified,” Cialdini continues,

Invariably these situations are manufactured by exploiters intent on creating the impression – reality be damned – that a multitude is performing the way exploiters want us to perform.

Making a claim that “The citizens of Collingwood have expressed growing dissatisfaction with the lack of good governance demonstrated by Town Council over the last two years” is like the canned laughter track on a TV sitcom: trying to exploit the herd mentality.

Once you understand the psychology behind influence and persuasion, it’s a lot easier to avoid flying east with the other geese, but it’s tough to change course once you take wing. Better to use a bit of protective skepticism when evaluating anyone’s claims before you take off.

[youtube=www.youtube.com/watch?v=cFdCzN7RYbw]

~~~~~

* Lying has always been an effective tactic because people want to believe that which reinforces their existing perceptions. When the Republicans rail on about President Obama being a “socialist,” the absurdity of the claim doesn’t matter, nor does the fact that it is a lie. GOP believers have a preconception of what they think a socialist is and how a socialist behaves – usually light years from the truth – and they willingly eschew facts and truths to believe what they already had in mind about Obama and Democrats.

The shrill jeremiads of Ann Coulter, that harridan of Republican disingenuousness, no matter how wildly improbable – let alone racist, myopic, sexist, or simply plain stupid – have a staunch following of right-wing fundamentalists because she plays to their existing notions about race, gender, and other issues. They believe her even as she spins further and further from reality – and sanity – because, having bought in so far, to give up now would be to raise self-doubt, to question the long chain of acceptance they have already made with her.

** Both Cialdini and Levine include in their books a lengthy analysis of Jim Jones and Jonestown as textbook examples of crowd behaviour and persuasion taken to the extreme. They are very interesting, disturbing reads about how the process can escalate from something simple to a suicidal cult. Fortunately, Jonestown was the exception rather than the rule, but many political, social and religious groups engage people using essentially the same rules, tactics and psychology.

*** Cults work this way. Levine documents how cults build the belief structures of their members by little steps: believe just a little bit today, then a bit more tomorrow, then a bit more the next… if the member ever realizes how far he or she has strayed from social norms, or from empirical realities, it is often too late to turn back. The belief chain, built on an accumulation of lies, is too strong to break.

You see it, too, in internet conspiracy theories. It often begins with just a small suspicion that the government – any, every government – is hiding something. Not telling us the whole truth. If they won’t give us the truth about Senate expenses, maybe they’re lying about chemtrails too. UFOs. Vaccinations. Creationism. Bigfoot. The Moon landings.

Hucksters and charlatans take advantage of this. Once we’re suspicious of “the government” we’re more prone to accept things that the government doesn’t like, doesn’t support, ignores or bans. If the government thinks homeopathy is a bundle of codswallop, the hucksters tell us it probably means the opposite: it’s good for you. If the police refuse to be sucked into the con of using a “psychic” to solve a case, then the charlatans say it’s because “psychics” are the only ones who know the truth.

All it takes is the carnie to excite the crowd with the lie. And once the crowd starts repeating it, the crowd believes it, too.

**** You may have heard something similar on The West Wing (S1-E3) when the president ended it by saying “when neither the facts nor the law are on your side, bang your fist on the table as loud as you can.”

 

 
