We like to think our brains can make rational decisions — but maybe they can't.
The way risks are presented can change the way we respond, says best-selling author Michael Lewis. In his new book, The Undoing Project, Lewis tells the story of Daniel Kahneman and Amos Tversky, two Israeli psychologists who made some surprising discoveries about the way people make decisions. Along the way, they also helped found an entire field: behavioral economics.
Lewis is also the author of Moneyball, which is about trusting statistics over intuition to build a successful baseball team. He tells NPR's Audie Cornish what Kahneman and Tversky were looking for, and how the Obama administration has put their findings to use.
On how framing something as a loss or a gain can affect the way people make decisions
If you have a patient in a doctor's office who's just been told they have terminal cancer, but there's this operation they could perform right now that might save their life. ... They have a 90 percent chance of surviving the operation — if you tell them that, they respond one way. If you tell them ... that they have a 10 percent chance of being killed by the operation, they are about three times less likely to have the operation.
If you frame something as a loss — 10 percent chance of dying — as opposed to as a gain — 90 percent chance of living — people respond entirely differently. They make a different decision.
On what Kahneman and Tversky were searching for
I think their central question is, "How does the mind work?" Not when it's in an emotional state, but when it thinks it's in a rational state; when it's faced with a judgment or a decision to make, what's it doing? That's what they were interested in.
On how the Obama administration has put Kahneman and Tversky's findings to use
The Obama administration actually had, and has, a unit in it that's responsible for framing decisions in a way that leads people to maybe make better decisions. This unit, for example, has gone through all the federal pension plans, many of which were opt-in: A worker had to check a box if they wanted to save a certain amount of money. They changed these plans to opt-out: You had to check a box if you didn't want to save a certain amount of money.
Just that change in what's called the "choice architecture" leads people to save massively more money. And that was just one of the things that Danny and Amos' work revealed: the importance of framing.
On applying Kahneman and Tversky's findings to the 2016 election
I filtered the election through Danny and Amos. They did these wonderful unfinished studies about how the human imagination worked. They called it "The Undoing Project." And they studied, briefly, how people undid tragedy.
So after the election, people who found the result tragic were obeying some of the rules of the imagination that Danny and Amos described: focusing on the FBI director, for example, at the end. Danny and Amos had pointed out that when people endure a tragedy and they try to undo it in their minds so that they get to some alternative reality where it didn't happen, they start at the end and they undo the last thing that happened. So they undo the field goal kicker missing the field goal, and they undo the grounder going through the legs of the first baseman in the ninth inning. ...
Whereas, you know, you can think of a thousand different things that could have happened — and that were more probable [than the actions of FBI Director James Comey] — that could have ended with Trump not being president.
On what gave him the idea for his new book
When [Moneyball] was published, it was pointed out to me in a review that the mistakes people make when they're judging other people had been described well by these two Israeli psychologists in the work they'd done in the 1970s. And I was oblivious to it.
This book, if anything, is like the prequel to Moneyball. It explains why experts' intuitive judgments can go wrong and why you need to have data to rely on as a check against the judgments of these experts.