Recently, I purchased an iPad Air.
It is the first tablet I have ever owned. Ever since the first generation I’ve always wanted one. Whenever I went into the Apple Store, it was the first thing I picked up and played with. If I ever went to a friend’s house, I would ask to play around with it, to see what my favorite sites looked like, and to experience an eBook on something bigger than my phone and not nearly as wide as my laptop.
But the process of justifying the purchase, of deciding whether or not to drop $500 on yet another Apple product, is what alarmed me. “I’ve never owned one; I definitely deserve it,” I said. “I can finance one for, like, $30 a month! I will read more eBooks if I have an iPad! I can get more reading done and have more notes for my commonplace book!”
Although some of these factors may be valid, I caught myself justifying my decision to ultimately make myself feel better about spending $500 on something that, well, wasn’t really a necessary purchase. I could have done a million things with that $500, and yet I went through with it.
What does it say about how we justify our decisions, and more importantly, our foolish beliefs, negative attitudes, ill-advised career choices, and more?
I recently devoured a book called Mistakes Were Made (But Not By Me), written by the remarkable social psychologists Carol Tavris and Elliot Aronson. It’s one of my favorite psychology books this year, because fleeing from reality is a topic that often stirs my mind: why do we do it if we know it’s bad for us in the long run, how are we hardwired to do it, and what can we do to stop it? Reading this brilliant and insightful book made me take a step back and view myself through the lens of self-awareness. How often do I justify my mistakes? Why is it so hard to own up to them? What beliefs do I hold onto that are, in fact, delusions?
To a great extent, we humans are great at fleeing from reality. We ignore the facts, shun any opposing criticism or feedback, and hold dearly onto our beliefs, attitudes, and choices, all to avoid being seen as, in the words of Donnie Brasco, a fugazi: a fake. When our minds are made up, that’s it. In short, we’re only human.
Tavris and Aronson explain how our need to self-justify is indeed a natural part of being human [emphasis by me]:
As fallible human beings, all of us share the impulse to justify ourselves and avoid taking responsibility for any actions that may turn out to be harmful, immoral, or stupid. Most of us will never be in a position to make decisions affecting the lives and deaths of millions of people, but whether the consequences of our mistakes are trivial or tragic, on a small scale or a national canvas, most of us find it difficult, if not impossible, to say, “I was wrong; I made a terrible mistake.” The higher the stakes — emotional, financial, moral — the greater the difficulty.
We see this in politics (the Watergate scandal), in business, in family dilemmas, in education, in police work, in law, and beyond.
But not all self-justification is bad. Like most human tendencies, it has both an upside and a downside. Tavris and Aronson explain it, as they do throughout the entire book, perfectly [emphasis by me]:
Self-justification has costs and benefits. By itself, it’s not necessarily a bad thing. It lets us sleep at night. Without it, we would prolong the awful pangs of embarrassment. We would torture ourselves with regret over the road not taken or over how badly we navigated the road we did take. We would agonize in the aftermath of almost every decision: Did we do the right thing, marry the right person, buy the right house, choose the best car, enter the right career? Yet, mindless self-justification, like quicksand, can draw us deeper into disaster. It blocks our ability to even see our errors, let alone correct them. It distorts reality, keeps us from getting all the information we need and assessing issues clearly.
So at this point you may be asking, what is the source of self-justification? What causes us to flinch and flee from reality? Two words: Cognitive dissonance. Tavris and Aronson explain what fuels this engine [emphasis by me]:
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions — especially the wrong ones — is an unpleasant feeling that Festinger called ‘cognitive dissonance.’ Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as ‘Smoking is a dumb thing to do because it could kill me’ and ‘I smoke two packs a day.’ Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it.
The last part of the last sentence is absolutely key: “…people don’t rest easy until they find a way to reduce it.”
Let’s take a moment of honesty now, bow our heads, and admit it: we’ve all been participants in the dance of self-justification. Say a friend calls you, but you don’t feel like answering because you know they’re going to ask you to come out tonight, and you really don’t want to. Instead of answering and being honest, you contemplate whether to pick up until the call goes to your voicemail. Feeling bad (because, hey, you’re an honest, great friend and always have been), you justify your decision: “Well, I’m really tired; they’ll understand. Besides, they’ve done it to me before.”
So here’s how the cognitive dissonance looked in my particular situation regarding the purchasing of an iPad Air:
Me: “An iPad would be nice, but I don’t really need it; it’s just a luxury, a gift for myself. I should save the money. I have an iPhone and a Macbook. Besides, I love print books anyway.”
After purchasing it:
“I never owned a tablet, and I really like the way they feel and look — my goodness this thing is light. I really waited a long time and I think I deserve it. For every generation that came out, I told myself I would get it, but I didn’t. Besides, $30 a month for 15 months isn’t all that bad. I’ll read a lot more and have great notes to fill my commonplace book.”
Now don’t get me wrong, I love my iPad and read a lot on it. But do you see how this all works? It’s better to own up to the mistake or choice, I think, than to bury yourself in delusions in order to flee from reality and responsibility. It doesn’t help in the long run; it ruins your character.
Tavris and Aronson offer some timeless advice on being aware of our need to justify our purchases, as I did, and, more importantly, on why we shouldn’t rely on friends or testimonials when buying a service or product [emphasis by me]:
The more costly a decision, in terms of time, money, effort, or inconvenience, and the more irrevocable its consequences, the greater the dissonance and the greater the need to reduce it by overemphasizing the good things about the choice made. Therefore, when you are about to make a big purchase or an important decision — which car or computer to buy, whether to undergo plastic surgery, or whether to sign up for a costly self-help program — don’t ask someone who has just done it. That person will be highly motivated to convince you that it’s the right thing to do. Ask people who have spent twelve years and $50,000 on a particular therapy if it helped, and most will say, ‘Dr. Weltschmerz is wonderful! I would never have found true love [got a new job] [lost weight] if it hadn’t been for him.’ After all that time and money, they aren’t likely to say, ‘Yeah, I saw Dr. Weltschmerz for twelve years, and boy, was it ever a waste.’ If you want advice on what product to buy, ask someone who is still gathering information and is still open-minded. And if you want to know whether a program will help you, don’t rely on testimonials: Get the data from controlled experiments.
We humans are adamant creatures. It’s hard to change someone else’s mind about eating at a specific restaurant, going through with a tough career choice, or finally dumping that idiot of a partner, but it’s even more difficult to change our own. Neuroscientists have now been able to examine why the mind is so hard to change. Tavris and Aronson provide the following:
Neuroscientists have recently shown that these biases in thinking are built into the very way the brain processes information — all brains, regardless of their owners’ political affiliation. For example, in a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them.
Let’s look at a prime, fascinating example involving two tribes in Sudan, the Dinka and the Nuer. If you ever visit these tribes, you may notice that many of the adults are missing their front teeth. Why? Tavris and Aronson explain how, once our minds are made up, they are very difficult to change and, more importantly, how self-justification keeps us holding onto foolish beliefs even when all the evidence in the world points in the opposite direction:
Anthropologists suggest that this tradition originated during an epidemic of lockjaw; missing front teeth would enable sufferers to get some nourishment. But if that were the reason, why in the world would villagers continue this custom once the danger had passed?
A practice that makes no sense at all to outsiders makes perfect sense when seen through the lens of dissonance theory. During the epidemic, the villagers would have begun extracting the front teeth of all their children, so that if any later contracted tetanus, the adults would be able to feed them. But this is a painful thing to do to children, especially since only some would become afflicted. To further justify their actions, to themselves and their children, the villagers would need to bolster the decision by adding benefits to the procedure after the fact. For example, they might convince themselves that pulling teeth has aesthetic value — say, that sunken-chin look is really quite attractive — and they might even turn the surgical ordeal into a rite of passage into adulthood. And, indeed, that is just what happened. ‘The toothless look is beautiful,’ the villagers say. ‘People who have all their teeth are ugly: They look like cannibals who would eat a person. A full set of teeth makes a man look like a donkey.’ The toothless look has other aesthetic advantages: ‘We like the hissing sound it creates when we speak.’ And adults reassure frightened children by saying, ‘This ritual is a sign of maturity.’ The original medical justification for the practice is long gone. The psychological self-justification remains.
It’s shocking how far we’ll go to feel better about ourselves, our beliefs, attitudes, choices, culture, and so on. Even with empirical evidence and scientific backing, we are too frightened to simply say, “Hmm, you may be right. I may have been looking at this the wrong way. I made a mistake.” Instead, we keep digging, hoping we can hide well enough that people won’t see us as frauds.
I could go on and on with this subject; I have endless notes on it. I find it highly important to grasp how our minds work, and especially how they work in situations where we seek to reduce dissonance, so that we can be self-aware and pop our own bubbles of self-justification.
The antidote to our seemingly automatic reflex of justifying our foolish beliefs and mistakes lies in self-awareness: focusing on principles over moods and surrounding ourselves with people who are willing to tell us we’re wrong. Tavris and Aronson admonish [emphasis by me]:
We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment — action, justification, further action — that increases our intensity and commitment, and may end up taking us far from our original intentions or principles.
A richer understanding of how and why our minds work as they do is the first step toward breaking the self-justification habit. And that, in turn, requires us to be more mindful of our behavior and the reasons for our choices. It takes time, self-reflection, and willingness.
In our private relationships, we are on our own, and that calls for some self-awareness. Once we understand how and when we need to reduce dissonance, we can become more vigilant about the process and often nip it in the bud; like Oprah, we can catch ourselves before we slide too far down the pyramid [There was a story included by the authors about Oprah and how she promoted a book that was actually a fraud. She adamantly stood by it, saying that the message resonated with her, but later on she dedicated an entire show to apologizing for her need to self-justify her foolish mistake. Someone in her circle probably pulled her out of her bubble, made her aware of what she was actually doing, and made her realize that if she continued to justify this belief, it would cause harm in the long run.] By looking at our actions critically and dispassionately, as if we were observing someone else, we stand a chance of breaking out of the cycle of action followed by self-justification, followed by more committed action. We can learn to put a little space between what we feel and how we respond, insert a moment of reflection, and think about whether we really want to buy that canoe in January, really want to send good money after bad, really want to hold on to a belief that is unfettered by facts. We might even change our minds before our brains freeze our thoughts into consistent patterns. Becoming aware that we are in a state of dissonance can help us make sharper, smarter, conscious choices instead of letting automatic, self-protective mechanisms resolve our discomfort in our favor.
They explain how Abraham Lincoln was the epitome of this:
“We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power. According to historian Doris Kearns Goodwin, Abraham Lincoln was one of the rare presidents who understood the importance of surrounding himself with people willing to disagree with him. Lincoln created a cabinet that included four of his political opponents, three of whom had run against him for the Republican nomination in 1860 and who felt humiliated, shaken, and angry to have lost to a relatively unknown backwoods lawyer…”
Mistakes Were Made (But Not By Me) is an alarming and equally insightful read on one of the most profound and regularly exercised functions of our human brain. From today on, we will continue to make mistakes, justify foolish beliefs and attitudes, and do whatever we can to avoid being seen as a fraud — fleeing from reality, after all, feels much better than looking someone in the eye and saying, “Mistakes were made.”
Abiding by principles instead of ephemeral moods, being self-aware, and surrounding yourself with naysayers is a great start toward not letting this bubble of self-justification become so opaque that we become oblivious to what we’re doing, both to ourselves and to others. The book provides an endless supply of empirical and shocking research and experiments that will make you wonder whether your own behaviors resemble those discussed. There is just too much I left out; I tried my best to capture the lessons and insights in one post, but, alas, it’s best if you pick up the book and study it on your own. I’m sure anyone who reads it will be compelled to be a better person, and will be equally shocked at how long they have been lying to themselves in the name of safeguarding their ego.