Perhaps the single most important characteristic that separates human beings from the rest of the animal kingdom is our ability to process information and make rational decisions.  And yet all too often, our decisions are not as well thought out as we might like.  Often that’s okay; it doesn’t take a lot of analysis to figure out that you shouldn’t walk alone at night in the wrong part of town.  But sometimes sloppy decision-making can lead to disastrous consequences.

In my Ph.D. research at Stanford University, I studied how people make decisions, and how they can make better decisions. I also studied how organizational decision-making frameworks can make some companies more entrepreneurial and competitive—and others more bureaucratic and stifling.

Understanding how people make decisions is core to marketing, investing, negotiation, motivation, and executive leadership. It allows us to understand other people’s decisions, as well as improve our own. It helps us understand good process and when process is called for—as well as when to chuck the process and follow our instincts.  It also helps us recognize and overcome the natural biases and errors that are built into us through millions of years of evolution.

Learn more about our built-in biases

Cognitive Illusions

We are all familiar with optical illusions: we look at a visual pattern, and it tricks us into thinking we are seeing something that isn’t really there. Optical illusions are designed to trick the part of our brains that processes vision. Magicians often exploit optical illusions to make us “see magic.”

Cognitive illusions are very similar, except that they are tricking the part of our brains that processes information and makes judgments. Cognitive illusions are frequently exploited by marketing departments, lotteries, advertising agencies, and casinos to trick us into thinking we are getting a better “deal” than we actually are.
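
To make the “deal” point concrete, here is a toy expected-value calculation in Python. The ticket price, prize tiers, and odds below are all invented for illustration—they do not describe any real lottery—but the arithmetic works the same way in the real cases.

```python
# Toy expected-value calculation for a hypothetical lottery ticket.
# All prices, prizes, and odds are invented for illustration.

ticket_price = 2.00  # hypothetical price per ticket, in dollars

# (prize, probability) pairs -- hypothetical prize tiers
prizes = [
    (1_000_000, 1 / 10_000_000),  # jackpot
    (1_000,     1 / 100_000),     # mid-tier prize
    (10,        1 / 100),         # small consolation prize
]

expected_payout = sum(prize * p for prize, p in prizes)  # $0.21 with these numbers
expected_loss = ticket_price - expected_payout           # $1.79 per ticket

print(f"Expected payout per ticket: ${expected_payout:.2f}")
print(f"Expected loss per ticket:   ${expected_loss:.2f}")
```

The headline jackpot grabs our attention, while the expected payout quietly sits far below the price of the ticket.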

Judgment Biases

Evolution has programmed us with certain judgment biases. We make errors in judgment all the time; some errors don’t matter, while others can be fatal.

We are descended from the cavemen who saw lions that were not there. We are not descended from the cavemen who failed to see the actual lions.

For millions of years our ancestors had to make certain types of judgments: what plants to eat, what animals to avoid, where and when to sleep, with whom to partner or mate, how to nurture and protect our children.  Over time, our brains adapted to make some of these critical judgments more efficient: certain judgments and heuristics became “hardwired” into us.

Now we live in a world with electricity, processed foods, and an environment that looks nothing like the open plains, caves, or jungles of our ancestors.  Many of our hardwired solutions are still important (as any parent knows when their child starts crying).  But some of our judgment biases are easily tricked or confused by our modern environment.

There are dozens of recognized cognitive biases.  And just as we can “see past” an optical illusion when we realize it’s there, we can often “think past” a cognitive bias once we consciously understand it.

Availability Bias

One example is the availability bias.  Our brains are wired to judge the likelihood of an event by how often, and how vividly, we see examples of it, rather than by how often it actually occurs.  This makes a lot of sense when you are a hunter-gatherer, but not so much when you are evaluating news stories on CNN.  In 1989, I was living in the Bay Area when the Loma Prieta earthquake hit.  Although it was scary, it did remarkably little severe damage—a section of a bridge, another section of a freeway, a few houses in one area of one city.  (There was quite a bit more minor and moderate damage, but still less than you might have expected.)  But news stations around the world showed pictures of the same five or six fires and collapsed structures over and over, for several days.  The result was that people who did not live in the Bay Area were left with the impression that the entire region had collapsed.  The repeated stories and pictures tricked their brains into thinking the destruction was far more widespread than it really was.
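
As a rough sketch of how this plays out, the short Python simulation below (my illustration, with invented numbers, not drawn from the earthquake coverage) compares the true rate of a rare event with an “availability” estimate based on a news feed that replays the same few incidents over and over.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical numbers, chosen only to make the effect visible.
DAYS = 1000
TRUE_DAILY_PROBABILITY = 0.01  # the event actually occurs on about 1% of days
REPLAYS_PER_EVENT = 25         # each incident is shown in the news many times

actual_events = 0
news_mentions = 0
for _ in range(DAYS):
    if random.random() < TRUE_DAILY_PROBABILITY:
        actual_events += 1
        news_mentions += REPLAYS_PER_EVENT  # the same incident, replayed

true_rate = actual_events / DAYS
# Naive "availability" estimate: count every mention as if it were a separate event.
perceived_rate = min(news_mentions / DAYS, 1.0)

print(f"True rate of the event:      {true_rate:.3f} per day")
print(f"Availability-based estimate: {perceived_rate:.3f} per day")
```

The distortion isn’t in the events themselves; it comes from the shortcut of counting how often we see something rather than how often it actually happens.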

Understanding and overcoming our own biases, as well as those of the people we interact with, is a huge leap forward toward making better decisions.

Learn about some of the great people I worked with

Ron Howard

My dissertation advisor, Ron Howard, created the field of Decision Analysis as an expansion of early work in decision theory. He has influenced many generations of students, going back to the 1950s. His early work on Markov models and his later work in Decision Analysis have had a huge influence on modern management science, affecting fields such as investing, marketing, strategy, medicine, and law.

Amos Tversky

Amos Tversky, the chairman of my orals committee, performed much of the seminal psychology research into how people actually make decisions, and what sorts of cognitive biases and traps are hard-wired into us. His work with Daniel Kahneman led directly to the development of behavioral economics and was ultimately recognized with the Nobel Memorial Prize in Economic Sciences, awarded to Kahneman in 2002. Amos’ work is widely viewed as among the most influential academic research of the past century.