I Told Me So: Self-deception and the Christian life
Consider: Based on surveys…
· 94% of professors think they’re doing an above-average job.
· 100% of high-school seniors thought they were above average in terms of ability to get along.
· 60% estimated they were in the top 10%, and 25% estimated they were in the top 1%.
What do you make of this? Is it a problem? Is it helpful to know?
It looks as if we are deluded about ourselves. I do find it pleasant to feel smart by quoting studies. But is there anything constructive we can do with this awareness? In I Told Me So: Self-deception and the Christian life, Gregg Ten Elshof offers help both with understanding the problem and responding appropriately. I’m especially pleased that the book is oriented toward Christians. Self-deception can make a farce of spiritual development. I think it's quite rampant in the church, and we’re more likely to listen to correction when it comes from someone we're inclined to trust.
Here's the picture that Ten Elshof draws:
I like having a high opinion of myself. Earning the self-image I crave can be tough. So life offers me a deal: there are ways to believe I’ve earned that high opinion without putting in the work to actually earn it. In other words, cheat. Similar deals are available for securing other kinds of agreeable beliefs. Whatever the payoff, in order to successfully execute the deal, I can’t catch myself in the act. Most people tend to take the deal. If I take the deal, it will seem (as it does in fact seem) as if I hadn’t.
Is it not impressive that this deal works? What arsenal of self-deceptive strategies makes it possible?
Attention Management: Filling your consciousness with arguments for what you already believe (or want to believe), or focusing on tearing down opposing arguments, yet believing you are seeking to understand the truth of the matter.
Example: Ashley, a Christian, reads lots of Christian apologetic literature, and loves the arguments she finds there. She doesn’t read much atheist literature or apologetics from other religions. When she does, she focuses on finding weaknesses in their arguments. Ashley feels that through all this reading, she has pursued the truth and come out with a justified confidence in the rational superiority of the Christian worldview.
Procrastination: When you believe there is something you ought to do or choose, but would rather not do it, delaying the choice – often with the effect that later on, what seemed right at the time becomes easier to ignore.
Example: An organization gives a presentation in church, and Gregg feels he ought to give to their cause. But instead of giving there and then, he tells himself he’ll go home, research them online, look at his finances, and then probably give even more generously than he would have done on the spot. The cause then loses urgency, and he never does that research nor gives any money.
Perspective switching: Choosing to see a situation from somebody else’s perspective when their perspective is more agreeable than your own.
Example: David orchestrates the premature death of Uriah so he can marry the man’s wife. The deed is played out such that it looks like Uriah was merely a casualty of war. David chooses to think of Uriah’s death as more of a convenient casualty than a murder, and goes on living without feeling morally troubled until Nathan calls him out on what he has done.
Rationalization: Constructing a rational justification for a behavior, decision, or belief arrived at in some other way – fictitious because the rational justification played no causal role in the behavior.
Example 1: A Christian mortgage broker implicitly encourages his clients to lie about their income on their applications. The real reason the broker does this is that it is standard practice and seems like the only way to support his family in his line of work. But he rationalizes that the lie is really just his client’s responsibility, and anyway leads to a win-win situation for both the client who wants the loan and the bank that wants to give it.
Example 2: Ed goes to an apologetics class marketed with the phrase “Find out WHY you believe what you believe”. Really, Ed believes what he does because his authorities told him it was true and it just feels right in his gut. But subconsciously this doesn’t seem to him like a good enough reason for belief; he certainly doesn't want to tell his non-Christian friends that. Ed will convince himself, and proclaim to others, that these rational arguments are why he believes the Christian message, even though in reality they play no causal role in his faith.
Ressentiment: Changing your feelings, values, or judgments to escape coming to terms with a disagreeable situation. Ten Elshof describes three types of ressentiment:
1. Scorn for an unavailable good
Example 1A: Aesop’s fable of the fox and the sour grapes. (After unsuccessfully trying to grab a high-hanging cluster of grapes, the fox decides they were probably sour anyway.)
Example 1B: Gregg, who doesn’t have lots of money, drives an old clunker. He praises the benefits of this situation: he doesn’t have to worry about it getting scratched, stolen or mistreated. Insurance costs less. He’s less susceptible to materialistic vanity. He tells himself he really prefers driving the old car. But when his parents offer to pass on to him a much newer and nicer vehicle, he happily accepts.
Example 1C: Around the beginning of the 20th century, ideas began to be taught in the universities that many felt threatened the plausibility of the tenets of Christian orthodoxy, making it more difficult for many conservative Christians to stay fully engaged in the intellectual scene. Anti-intellectualism then grew among them, sometimes displayed as outright distrust of higher education.
2. Pushing the unavailable good to the edges of consciousness by super-valuing something else
Example 2: In the situation of example 1C, other Christians effectively discredited the value of the life of the mind by way of emphasizing the importance of faith and the heart (which of course were appropriate values).
3. Identifying an unacceptable sentiment as something else
Example 3A: Ashley is angry with Jennifer, but casts it as being “concerned for her” or “sad about what she’s doing”. Admitting anger might imply that Ashley hasn’t forgiven Jennifer, which would be unacceptable for a Christian.
Example 3B: Chris is envious of Mike’s fancy new TV, but spins his feelings as concern that the TV will be unhealthy for Mike and his family.
Perhaps you’ve noticed that self-deception is often evident from the outside. If I wanted to stop deceiving myself, couldn’t it be as easy as asking a friend for an honest assessment? Sometimes. But often the people around us are complicit in our self-deceptive strategies. These shared strategies tend to be one form or another of groupthink.
Groupthink: Stifling a group’s capacity for critical thinking and careful decision-making out of a desire for conformity or harmony. As defined by psychologist Irving Janis, groupthink is characterized by eight symptoms: an illusion of invulnerability, collective rationalization, a belief in inherent morality (ignoring ethical consequences because the group’s cause is right), stereotyped views of out-groups, direct pressure on dissenters, self-censorship (a reluctance to voice doubts or reservations), an illusion of unanimity, and self-appointed “mind guards” (people who keep the leader from being bothered by problematic information). In Janis’s analysis, numerous US foreign policy disasters were largely a result of groupthink, including the US failure to anticipate the Japanese assault on Pearl Harbor, the escalation of the Vietnam War under President Johnson, and the failed US invasion of Cuba at the Bay of Pigs.
Manifestation 1: Corporate Groupthink: Subordinates often wish to ingratiate themselves with their leader, and this creates pressure toward conformity. That pressure in turn means everyone goes along with the boss’s favorite picture. The group then backfires: instead of being an engine of critical thinking that produces a fuller picture on which to base better decisions, it reinforces the leader’s blind spots, making him more confident in whatever half-baked idea he might pursue.
Manifestation 2: The Game of Happy Family: Sometimes the members of a family (loosely defined) cooperate to keep certain things hidden from themselves. Psychiatrist R. D. Laing characterized this group-level deception as employing the following rules:
Rule A: Don’t.
Rule A.1: Rule A does not exist.
Rule A.2: Do not discuss the existence or nonexistence of Rules A, A.1, or A.2.
Example A: An alcoholic father is abusive to his family. The family members systematically ignore the evidence of the problem and do not hint at its existence to each other. When the problem finally gets dealt with, the victims are astounded at what they experienced but disregarded.
Example B: In a prosperous American church, tithing is encouraged, but hardly anyone raises the question of whether there’s a moral problem with Christians (who can afford these things) buying nice cars, fine food, or new appliances, or making large expenditures on entertainment.
Rule A: Don’t question the moral legitimacy of buying a new BMW.
Rule A.1: Rule A does not exist.
Rule A.2: Do not discuss the existence or nonexistence of Rules A, A.1, or A.2.
Sometimes a member returning from a short-term mission trip to a developing country may become troubled about these questions and wonder why nobody else seems to be. But typically, in time, they readjust, become comfortable again, and make no long-term changes in their lifestyle.
In contrast, people in various social strata often feel morally queasy about the exorbitant, materialistic lifestyle of those in the next stratum up. Jesus’ warnings about having your heart carried away by wealth may ring true when you imagine those richer people, but not when you look at your own life.
So what can we do about all this? My next post will explore Ten Elshof's positive suggestions.