Over the last few months I’ve been devouring books and studies about decision-making. I started with the timeless Thinking, Fast and Slow by Nobel laureate Daniel Kahneman, and down the rabbit hole I went.
Learning about sunk costs and ego depletion made me reminisce about past events. The more I read about our psychological hiccups in decision-making, the more I realized I was woefully underdeveloped in something that we do every day, something that impacts the quality of our lives and the manifestation of our character.
All of the decisions we make create or influence change, for good or ill.
The first difficult thing to admit is, “I am not as good a decision-maker as I claim to be.” If you’ve been to a movie, realized within minutes that you hated it, but decided to stay based on the justification that it took you 25 minutes to get there and $12 for a ticket, you didn’t make a smart decision. When I think about simple consequences like this, it makes me wonder about the other kinds of decisions I make in my life.
There is a growing body of knowledge that can help us understand the decision-making process, and in turn, learn to make better decisions.
If we do something every day, it’s worth the time and investment to learn how to do it well. Here are some resources and insights that I’ve gathered from my studies.
Systems 1 and 2
To begin, we have to start with the psychology of the mind.
As Kahneman said in Thinking, Fast and Slow:
“I adopt terms originally proposed by the psychologists Keith Stanovich and Richard West, and will refer to two systems in the mind, System 1 and System 2.
– System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
– System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are main sources of the explicit beliefs and deliberate choices of System 2.”
Chip and Dan Heath in Switch: How to Change Things When Change is Hard, share the metaphor of the mind using references from Jonathan Haidt:
“Haidt says that our emotional side is an Elephant and our rational side is its Rider. Perched atop the Elephant, the Rider holds the reins and seems to be the leader. But the Rider’s control is precarious because the Rider is so small relative to the Elephant. Anytime the six-ton Elephant and the Rider disagree about which direction to go, the Rider is going to lose. He’s completely overmatched.
The weakness of the Elephant, our emotional and instinctive side, is clear: It’s lazy and skittish, often looking for the quick payoff (ice cream cone) over the long-term payoff (being thin). When change efforts fail, it’s usually the Elephant’s fault, since the kinds of changes we want typically involve short-term sacrifices for long-term payoffs. Changes often fail because the Rider simply can’t keep the Elephant on the road long enough to reach the destination.
The Elephant’s hunger for instant gratification is the opposite of the Rider’s strength, which is the ability to think long-term, to plan, to think beyond the moment.
But what may surprise you is that the Elephant also has enormous strengths and that the Rider has crippling weaknesses. The Elephant isn’t always the bad guy. Emotion is the Elephant’s turf—love and compassion and sympathy and loyalty. That fierce instinct you have to protect your kids against harm—that’s the Elephant. That spine-stiffening you feel when you need to stand up for yourself—that’s the Elephant.”
Consequences versus Identity
Switch is not directly about decision-making but about how the process itself leads to desired change, and in it Chip and Dan Heath share two different models of decision making:
“[James] March says that when people make choices, they tend to rely on one of two basic models of decision making: the consequences model or the identity model. The consequences model is familiar to students of economics. It assumes that when we have a decision to make, we weigh the cost and benefits of our options and make the choice that maximizes our satisfaction. It’s a rational, analytical approach.
In the identity model of decision making, we essentially ask ourselves three questions when we have a decision to make: Who am I? What kind of situation is this? What would someone like me do in this situation? Notice what’s missing: any calculation of costs and benefits. The identity model explains the way most people vote, which contradicts our notion of the ‘self-interested voter.’
Generally, when we use the word identity, we’re talking about an immutable trait of some kind—such as racial, ethnic, or regional identity. But that’s a relatively narrow use of the term. We’re not just born with an identity; we adopt identities throughout our lives.
Because identities are central to the way people make decisions, any change effort that violates someone’s identity is likely doomed to failure. (That’s why it’s so clumsy when people instinctively reach for ‘incentives’ to change other people’s behavior.) So the question is this: How can you make your change a matter of identity rather than a matter of consequences?”
Opportunity Costs
Coined in 1914 by Austrian economist Friedrich von Wieser, and anticipated roughly 150 years earlier by Benjamin Franklin when he said, “Time is money,” the concept of opportunity cost is simple: if you don’t do this, what would you do instead?
A key concept in economics, opportunity cost is about scarcity and choice. As Seth Godin brilliantly said,
“Opportunity cost is the value of the lost opportunity, the benefit of the thing you could have done instead of what you’re doing now. … Opportunity cost is at its most expensive when we miss opportunities. It’s not merely the cost of making a choice, it’s the real penalty of standing by as something important is happening without us.”
The more I understood this concept, the more painful it became to spend time on things that didn’t add value to my life.
Framing
When we ask ourselves questions in order to make a decision, frames get put into place.
When we put up the wrong frames around a problem or question, we close ourselves off from the possibilities that need to be seen and questioned. This is at the heart of why we make bad decisions.
If a store owner loses their lease, they naturally insist on finding another store. That’s a frame, and it focuses our attention only on the possibility of having a physical store. But what if the store owner changed the frame by asking themselves, “What if we were online only?” Possibilities open up.
There’s a great study in Practical Wisdom by Barry Schwartz and Kenneth Sharpe on how framing influences the way people played games:
“In the study, all participants played the same game, but for some, it was framed as the ‘Wall Street Game,’ whereas for others, it was framed as the ‘Community Game.’ What a difference a ‘frame’ makes. Participants were much more likely to defect during the ‘Wall Street Game’ than during the ‘Community Game.’ A similar study was labeled the ‘Business Transaction Study’ for some and the ‘Social Exchange Study’ for others. More cooperation occurred in the second case than in the first. The ‘Social Exchange’ frame, the researchers suggest, induced a motivation for the players to do what was right; the ‘Business Transaction’ frame induced the motivation to get as much money from playing the game as they could.”*
Also worth reading is Daniel Kahneman and Amos Tversky’s paper, “The Framing of Decisions and the Psychology of Choice” (PDF).
What people who make smart decisions do, then, is make a conscious shift from one frame to another, from one question that’s obvious to one that’s difficult and scary to answer.
* Liberman, V., Samuels, S. M., and Ross, L. (2004). The name of the game: Predictive power of reputations versus situational labels in determining prisoner’s dilemma game moves. Personality and Social Psychology Bulletin, 30, 1175–1185.
Expected Value
The expected value of a decision is the value of each possible outcome multiplied by the probability that it will happen, summed across the outcomes.
Professor Julian Simon explains it in a nine-minute video.
I recommend reading my friend Michelle Florendo’s post on how she untangles messy decisions with decision trees.
Decision trees are a helpful practice because they force us to lay out a decision’s options, probabilities, and payoffs before we make it. We can use them to determine the expected value of decisions we make every day, like taking the bus.
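The arithmetic behind a decision tree fits in a few lines of code. The sketch below is my own toy example, not one from the books or posts above: the fares, the 20 percent chance of a late bus, and the $10 value placed on 30 lost minutes are all invented for illustration.

```python
def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs whose probabilities sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * v for p, v in outcomes)

# Branch 1: take the bus. $2.50 fare, but a 20% chance it runs late,
# costing an extra 30 minutes that we value at $10.
bus = expected_value([(0.8, -2.50), (0.2, -2.50 - 10.00)])

# Branch 2: take a ride-share. Flat $12; assume it is always on time.
rideshare = expected_value([(1.0, -12.00)])

# The branch with the highest (least negative) expected value wins.
best = max([("bus", bus), ("ride-share", rideshare)], key=lambda t: t[1])
print(f"bus: {bus:.2f}, ride-share: {rideshare:.2f}, take the {best[0]}")
```

Even with the occasional late bus priced in, the bus’s expected cost here (−$4.50) beats the ride-share’s (−$12.00); changing the made-up numbers changes the answer, which is exactly the point of drawing the tree.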
Sunk Costs and Loss Aversion
My favorite psychological phenomenon is the sunk cost.
As author David McRaney, who explored self-delusion in his fantastic book You Are Now Less Dumb, put it:
“Sunk costs are payments or investments that can never be recovered. An android with fully functioning logic circuits would never make a decision that took sunk costs into account, but you would. As an emotional human, your aversion to loss often leads you right into the sunk cost fallacy. A confirmed loss lingers and grows in your mind, becoming larger in your history than it was when you first felt it. Whenever this clinging to the past becomes a factor in making decisions about your future, you run the risk of being derailed by the sunk cost fallacy.”
What’s worth further understanding is when McRaney said, “your aversion to loss often leads you right into the sunk cost fallacy.”
What does this mean?
He continues by sharing Kahneman’s study (PDF) on loss aversion:
“Kahneman explains that since all decisions involve uncertainty about the future, the human brain you use to make decisions has evolved an automatic and unconscious system for judging how to proceed when a potential for loss arises. Kahneman says organisms that placed more urgency on avoiding threats than they did on maximizing opportunities were more likely to pass on their genes. So, over time, the prospect of losses has become a more powerful motivator on your behavior than the promise of gains. Whenever possible, you try to avoid losses of any kind, and when comparing losses to gains you don’t treat them equally. The results of their experiments and the results of many others who’ve replicated and expanded on them have teased out an inborn loss aversion ratio. When offered a chance to accept or reject a gamble, most people refuse to make a bet unless the possible payoff is around double the potential loss.”
The wisdom is simple, but of course, difficult to exercise: Ignore sunk costs.
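That roughly two-to-one ratio can be written as a tiny model. The sketch below assumes a 50/50 gamble and uses the stylized loss-aversion coefficient of about 2 suggested by Kahneman and Tversky’s experiments; the function name and dollar amounts are mine, for illustration only.

```python
LOSS_AVERSION = 2.0  # stylized coefficient: losses loom about twice as large as gains

def feels_acceptable(gain, loss, lam=LOSS_AVERSION):
    """Return True if a 50/50 win-`gain`/lose-`loss` gamble has positive
    subjective value once the loss is weighted by the loss-aversion coefficient."""
    subjective_value = 0.5 * gain - 0.5 * lam * loss
    return subjective_value > 0

print(feels_acceptable(gain=150, loss=100))  # False: $150 isn't enough to risk $100
print(feels_acceptable(gain=250, loss=100))  # True: the payoff is over double the loss
```

In this simple model the break-even point sits exactly where McRaney describes: the possible payoff has to be about double the potential loss before the bet starts to feel acceptable.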
Why We Compare When Making Decisions
Did you buy that outfit based on your own personal preference? When you bought that new mountain bike, how did you decide that this was the right one?
By nature, human beings make comparisons in order to determine quality and to ultimately feel confident about their decision. As Dan Ariely said in Predictably Irrational:
“Let me start with a fundamental observation: most people don’t know what they want unless they see it in context. We don’t know what kind of racing bike we want—until we see a champ in the Tour de France ratcheting the gears on a particular model. We don’t know what kind of speaker system we like—until we hear a set of speakers that sounds better than the previous one. We don’t even know what we want to do with our lives—until we find a relative or a friend who is doing just what we think we should be doing. Everything is relative, and that’s the point. Like an airplane pilot landing in the dark, we want runway lights on either side of us, guiding us to the place where we can touch down our wheels.”
A fantastic example of this in action is when Ariely shared the insights about the decoy effect.
“What if you are single, and hope to appeal to as many attractive potential dating partners as possible at an upcoming singles event? My advice would be to bring a friend who has your basic physical characteristics (similar coloring, body type, facial features), but is slightly less attractive (−you). Why? Because the folks you want to attract will have a hard time evaluating you with no comparables around. However, if you are compared with a ‘−you,’ the decoy friend will do a lot to make you look better, not just in comparison with the decoy but also in general, and in comparison with all the other people around.”
How Memories Influence Our Decisions
Whenever I hear rants about the rise in gas prices, it reminds me of an insight from Dan Ariely’s book, Predictably Irrational.
How much does your memory of something influence your decision?
“Here is an illustration of this idea. Consider your consumption of milk and wine. Now imagine that two new taxes will be introduced tomorrow. One will cut the price of wine by 50 percent, and the other will increase the price of milk by 100 percent. What do you think will happen? These price changes will surely affect consumption, and many people will walk around slightly happier and with less calcium. But now imagine this: What if the new taxes are accompanied by induced amnesia for the previous prices of wine and milk? What if the prices change in the same way, but you do not remember what you paid for these two products in the past?
If people had no memory of past prices, the consumption of milk and wine would remain essentially the same, as if the prices had not changed. In other words, the sensitivity we show to price changes might in fact be largely a result of our memory for the prices we have paid in the past and our desire for coherence with our past decisions—not at all a reflection of our true preferences or our level of demand.
I am not suggesting that doubling the price of gasoline would have no effect on consumers’ demands. But I do believe that in the long term, it would have a much smaller influence on demand than would be assumed from just observing the short-term market reaction to price increases.”
Intuition
It’s worth understanding what intuition is and how often we rely on it to make decisions. It’s also worth realizing that if you make all of your decisions on intuition and everything goes well for you, you’re either lucky or genuinely gifted.
Kahneman, as always, provides a great definition:
“We have all heard such stories of expert intuition: the chess master who walks past a street game and announces, ‘White mates in three’ without stopping, or the physician who makes a complex diagnosis after a single glance at a patient. Expert intuition strikes us as magical, but it is not. Indeed, each of us performs feats of intuitive expertise many times each day. Most of us are pitch-perfect in detecting anger in the first word of a telephone call, recognize as we enter a room that we were the subject of the conversation, and quickly react to subtle signs that the driver of the car in the next lane is dangerous. . . . The psychology of accurate intuition involves no magic. Perhaps the best short statement of it is by the great Herbert Simon, who studied chess masters and showed that after thousands of hours of practice they come to see the pieces on the board differently from the rest of us. You can feel Simon’s impatience with the mythologizing of expert intuition when he writes: ‘The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.’”
But every piece of knowledge about decision-making that you learn can change the way you lead your life. Bad movies will no longer be the bane of your afternoon, and possibilities will widen once you change the frames around a problem.