Introduction to behavioral economics

Just days ago, President Obama issued this executive order:

A growing body of evidence demonstrates that behavioral science insights -- research findings from fields such as behavioral economics and psychology about how people make decisions and act on them -- can be used to design government policies to better serve the American people.

So just what is behavioral economics? Well, if you're working on problems that involve people making decisions, you'd better at least know the basics. (And let's be serious... if a problem doesn't involve human behavior, is it even a problem?)

Let's dive in...

Traditional economic theory is founded upon the concept of homo economicus (economic man): the theory that humans are "consistently rational and narrowly self-interested agents who usually pursue their subjectively-defined ends optimally" (Wikipedia). Given competing options, we'll use the information available to behave in ways that maximize our self-interest.

So what's wrong with the homo economicus model?

Well, when we look at the actual choices and behaviors of most people, they don't line up with this model of rational self-interest. In reality, we make some choices well and others quite poorly.

Luckily, new theories are emerging to help us understand these poor choices, homo economicus be damned. Behavioral economics is one of those theories, and its forefathers are Daniel Kahneman and Amos Tversky. Kahneman won the Nobel Memorial Prize in Economic Sciences in 2002 for his work on the hidden forces that shape our decisions. Understanding why we make some choices irrationally can be immensely powerful - both for improving our own decisions and the decisions of others.

This week we dive into a number of the biases that shape and skew the choices we make, pulling from work by Daniel Kahneman, Richard Thaler, Cass Sunstein, and Dan Ariely. Next week we'll explore ways to apply these insights to complex challenges.

The brain: two systems

Before we dive into these biases, a little background on how our minds work. Daniel Kahneman argues our poor decisions are rooted in complex interactions between two systems in the brain. In Thinking, Fast and Slow, he defines the two systems as follows:

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

System 1 is fast and helps us recognize what is going on around us, often seeding our memory (both knowingly and unknowingly). System 2 is slower and more deliberate, but often works off information seeded in our memory by System 1. The interplay between these two systems creates surprising yet predictable results that bias the choices we make.

Cognitive biases

Collectively these biases are called "cognitive biases" -- biases that affect our ability to acquire knowledge and make decisions. We've summarized the ones you're most likely to run into, but you should take the time to review them all.

Anchoring

We are heavily biased by where we start. If we're asked to predict the population of San Francisco but are first told the population of a smaller city, we'll reliably underestimate. The opposite is true if we're first anchored with a much larger city. This works even with obviously irrelevant anchors (like thinking of the last three digits of your phone number before guessing when Attila the Hun lived).

Anchoring can occur with a simple re-ordering of questions. Ask someone whether they are happy, then ask how often they date, and you'll see very little correlation between the two answers. Flip the order and you'll suddenly get a much greater correlation.

Basically, anchors provide starting points that we adjust from - which could work, except our adjustments are rarely large enough.

Availability

We predict the likelihood of an event based on how readily we can think of examples of that event. Take estimates of the relative rates of homicide and suicide: since we hear about murder far more often, we're inclined to predict that murder is more common. In reality, suicide is roughly twice as common in the U.S. as homicide.

Since this bias is influenced by how easily we can recall similar events, events that happened recently or that are more graphic introduce an even greater bias on our predictions. Informing people of accurate probabilities can help mitigate this bias (but it's still tough to overcome).

Representativeness

We overweight qualitative characteristics when they match our stereotypes in spite of probabilities that should sway us otherwise. Kahneman shares the following example in Thinking Fast and Slow:

And if you must guess whether a woman who is described as "a shy poetry lover" studied Chinese literature or business administration, you should opt for the latter option.

When something seems to share enough characteristics to lump it into a given category, we over-assign additional characteristics we believe to be stereotypical of that category. A great example from Psychlopedia is assuming that "people who wear sandals, ride a bike everywhere, and support liberal causes are vegetarians because they hold enough of the characteristics of your concept of a vegetarian to belong to that group."

Representativeness bias also shows up in events largely determined by chance. We tend to see patterns where there are none because we don't have an accurate sense of what a truly random sequence looks like. Consider the much-loved cultural myth of being "on fire" in basketball. In reality, players "who have made their last few shots are no more likely to make their next shot (actually a bit less likely)" (Thaler & Sunstein, 2009).
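
If you want to feel how streaky pure chance can look, here's a minimal simulation sketch (the 50% shooting rate, 20 shots per game, and 1,000 games are illustrative assumptions, not figures from Thaler & Sunstein):

```python
import random

random.seed(42)  # fixed seed so runs are reproducible

def longest_streak(shots):
    """Length of the longest run of consecutive makes."""
    best = current = 0
    for made in shots:
        current = current + 1 if made else 0
        best = max(best, current)
    return best

# 1,000 games of 20 independent 50/50 shots each - a "coin-flip shooter".
games = [[random.random() < 0.5 for _ in range(20)] for _ in range(1000)]
streaks = [longest_streak(game) for game in games]

print(f"Average longest streak: {sum(streaks) / len(streaks):.1f} makes")
print(f"Games with a streak of 5+ makes: {sum(s >= 5 for s in streaks)} of 1000")
```

Even though every shot is independent, long runs of makes show up constantly - exactly the "hot hand" pattern we're so eager to read meaning into.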

Social influence and norms

We are strongly influenced by what others do. Norms are the behavioral expectations within a society or group. They have a powerful automatic effect on behavior - sometimes that power comes from penalties inflicted for not following the norm, other times from the social benefits of conforming.

Consider a sign in a hotel room asking guests to reuse their towels to help the environment. By introducing the norm that most guests reuse their towels at least once during their stay, a hotel was able to increase the number of guests who complied by 9%.

The social networks we are a part of can play a big role here, creating what looks like contagious behavior or "herding". We imitate the behavior of others in our network, regardless of whether it is in our best interest. This shows up as a higher likelihood of becoming overweight when our friends are overweight, or as shifts in energy usage when we're able to compare our energy bill with our neighbors' (a warning: show someone they use far less energy than others and they're likely to increase their usage).

Decoy effect and relative thinking

Our preferences between two options change significantly when presented with a third option that is asymmetrically dominated. What exactly does that mean?

Say you're presented with two options. Option A is a trip to Bali with free breakfast. Option B is a trip to Thailand with free breakfast. By adding an option C of a trip to Thailand without a free breakfast, we'll skew the choice significantly toward option B (much more than if we just had people pick between option A and option B).

Consider another case: when asked whether you'd drive 15 minutes to another store to save $10 on a $30 book, most people will. Presented with the same opportunity on a $600 mattress, most won't - even though the amount saved and the time required are identical. We weigh the saving relative to the price, not in absolute dollars.
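
To see just how different the same $10 looks in relative terms, here's a tiny sketch (prices taken from the example above):

```python
# Relative thinking: the same $10 saving, framed against each price.
SAVING = 10
for price in (30, 600):
    print(f"${SAVING} off ${price}: {SAVING / price:.1%} of the price")
```

The identical saving registers as a third of the book's price but under 2% of the mattress's - which is why the trip feels worth it in one case and not the other.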

Other common biases

  • Status quo. Just like Newton's first law, we tend to stick with our current choices even when they're clearly not in our best interest - just because it's what we've done in the past. People tend to park in the same parking spot at work even when spaces aren't assigned and rarely change the initial asset allocation in their 401(k)s.
  • Framing. How we frame choices matters. When presented with the decision to undergo an intensive surgery, you're far more likely to proceed with the surgery if the odds are framed as "90 out of 100 patients survive" than "10 out of 100 patients die."
  • Loss aversion. We hate to lose. Losing hurts roughly twice as much as winning the same amount feels good (see the sketch after this list). Since we are reluctant to give up what we have, this bias discourages change and reinforces the status quo.
  • Zero price effect. Something that is free produces irrational excitement. Normally we consider both upsides and downsides but when something is free, we forget about the downside. We pay too much when we pay nothing. Free is an incredibly powerful driver of human behavior.
  • Endowment effect. Once we own something, we place a much higher value on it. In some cases, this sense of ownership comes before we actually own it based on how much work we put towards acquiring it (think online auctions).
  • Overconfidence. When asked to predict our performance relative to others, we reliably overestimate. Think of this as the "above average" bias (only 5% of a class expecting their performance to be below the median, when by definition half must land there). This bias encourages individual risk taking (it won't happen to me).
  • Emotions. When we are aroused, angry, frustrated, or hungry, we reliably make irrational decisions. In these situations, emotion crowds out deliberate reasoning.
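
About that "twice as bad" claim: prospect theory (Tversky & Kahneman, 1992) formalizes loss aversion with a value function that is steeper for losses than for gains. Here's a minimal sketch using the parameter estimates from that paper (the specific numbers are theirs, not from the books discussed here):

```python
# Prospect-theory value function (Tversky & Kahneman, 1992).
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss aversion: losses loom ~2.25x larger than gains

def subjective_value(x):
    """Felt value of gaining (x > 0) or losing (x < 0) x dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(round(subjective_value(100), 1))   # 57.5 - how good +$100 feels
print(round(subjective_value(-100), 1))  # -129.5 - how bad -$100 feels
```

On paper, winning $100 and losing $100 are symmetric; subjectively, the loss feels more than twice as large. That asymmetry also powers the endowment effect above: giving something up reads as a loss.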

Here be dragons...

We don't always make bad decisions. If the decisions we're making are in domains where we have lots of experience, good information, and prompt feedback, we're quite good.

Unfortunately, many of our most important decisions don't meet these criteria. We're especially bad when faced with decisions we make infrequently, where we don't get immediate feedback, and where we face short-term benefits and long-term costs. Sound familiar?

If you want to make better decisions and help others do the same, I highly recommend picking up copies of Thinking, Fast and Slow, Nudge, and Predictably Irrational. They're packed with additional details, examples, and guidance on how to avoid these biases and make better decisions.

Remember, something is always influencing your choices. If you can't control your decisions, make sure you at least know what is.