Thinking in Bets
04 Apr 2021
These are my notes & quotes on ideas and concepts I found interesting from Thinking in Bets. Buy the book →
Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
[Embedded tweet from NFL (@NFL), September 21, 2019: “No. 5: Malcolm Butler’s game-winning interception in Super Bowl XLIX (Feb. 1, 2015)”]
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say, “I should have known that would happen,” or, “I should have seen it coming,” we are succumbing to hindsight bias.
Von Neumann is also the father of game theory. After finishing his day job on the Manhattan Project, he collaborated with Oskar Morgenstern to publish Theory of Games and Economic Behavior in 1944. The Boston Public Library’s list of the “100 Most Influential Books of the Century” includes Theory of Games. William Poundstone, author of a widely read book on game theory, Prisoner’s Dilemma, called it “one of the most influential and least-read books of the twentieth century.” The introduction to the sixtieth-anniversary edition pointed out how the book was instantly recognized as a classic. Initial reviews in the most prestigious academic journals heaped it with praise, like “one of the major scientific achievements of the first half of the twentieth century” and “ten more such books and the progress of economics is assured.”
The decisions we make in our lives—in business, saving and spending, health and lifestyle choices, raising our children, and relationships—easily fit von Neumann’s definition of “real games.” They involve uncertainty, risk, and occasional deception, prominent elements in poker. Trouble follows when we treat life decisions as if they were chess decisions.
Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
The quality of our lives is the sum of decision quality plus luck.
Admitting that we don’t know has an undeservedly bad reputation. Of course, we want to encourage acquiring knowledge, but the first step is understanding what we don’t know. Neuroscientist Stuart Firestein’s book Ignorance: How It Drives Science champions the virtue of recognizing the limits of our knowledge. (You can get a taste of the book by watching his TED Talk, “The Pursuit of Ignorance.”) In the book and the talk, Firestein points out that in science, “I don’t know” is not a failure but a necessary step toward enlightenment. He backs this up with a great quote from physicist James Clerk Maxwell: “Thoroughly conscious ignorance is the prelude to every real advance in science.” I would add that this is a prelude to every great decision that has ever been made. What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”
Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration.
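A toy simulation (my own illustration, not from the book) makes this concrete: take a hypothetical bet that wins $1 with probability 0.6 and loses $1 otherwise. Its expected value is 0.6 × 1 + 0.4 × (−1) = +0.2 per bet, so taking it is a good decision, even though any single iteration can still lose.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def play_once():
    """One iteration of a hypothetical favorable bet: +$1 with p=0.6, else -$1."""
    return 1 if random.random() < 0.6 else -1

outcomes = [play_once() for _ in range(10_000)]

# Any individual outcome can be a loss -- that doesn't make the decision wrong.
first_bet = outcomes[0]

# Over many iterations the average result converges toward the +0.2 expected value,
# which is what actually reflects decision quality.
long_run_average = sum(outcomes) / len(outcomes)
```

Resulting, in these terms, is judging the decision by `first_bet` instead of by `long_run_average`.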
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing. We are constantly deciding among alternative futures: one where we go to the movies, one where we go bowling, one where we stay home. Or futures where we take a job in Des Moines, stay at our current job, or take some time away from work. Whenever we make a choice, we are betting on a potential future. We are betting that the future version of us that results from the decisions we make will be better off. At stake in a decision is that the return to us (measured in money, time, happiness, health, or whatever we value in that circumstance) will be greater than what we are giving up by betting against the other alternative future versions of us.
It turns out, though, that we actually form abstract beliefs this way: (1) we hear something; (2) we believe it to be true; (3) only sometimes, later, if we have the time or the inclination, do we think about it and vet it, determining whether it is, in fact, true or false.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
When someone challenges us to bet on a belief, signaling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us. How do I know this? Where did I get this information? Who did I get it from? What is the quality of my sources? How much do I trust them? How up to date is my information? How much information do I have that is relevant to the belief? What other things like this have I been confident about that turned out not to be true? What are the other plausible alternatives? What do I know about the person challenging my belief? What is their view of how credible my opinion is? What do they know that I don’t know? What is their level of expertise? What am I missing?
In poker, the bulk of what goes on is watching. An experienced player will choose to play only about 20% of the hands they are dealt, forfeiting the other 80% of the hands before even getting past the first round of betting. That means about 80% of the time is spent just watching other people play. Even if a poker player doesn’t learn all that efficiently from the outcomes of the hands they play themselves, there is still a whole lot to be learned from watching what happens to the other players in the game. After all, there is four times as much watching everyone else as there is playing a hand yourself.
As artist and writer Jean Cocteau said, “We must believe in luck. For how else can we explain the success of those we don’t like?”
We think we know the ingredients for happiness. Sonja Lyubomirsky, a psychology professor at the University of California, Riverside, and popular author on the subject of happiness, summarized several reviews of the literature on the elements we commonly consider: “a comfortable income, robust health, a supportive marriage, and lack of tragedy or trauma.” Lyubomirsky noted, however, that “the general conclusion from almost a century of research on the determinants of well-being is that objective circumstances, demographic variables, and life events are correlated with happiness less strongly than intuition and everyday experience tell us they ought to be. By several estimates, all of these variables put together account for no more than 8% to 15% of the variance in happiness.” What accounts for most of the variance in happiness is how we’re doing comparatively. (The breadth and depth of all that research on happiness and its implications is important, but it’s beyond what we need to understand our issue with sorting others’ outcomes. I encourage you to read Lyubomirsky’s work on the subject, Daniel Gilbert’s Stumbling on Happiness, and Jonathan Haidt’s The Happiness Hypothesis, cited in the Selected Bibliography and Recommendations for Further Reading.)
Euphoria or misery, with no choices in between, is not a very self-compassionate way to live.
Living in the matrix is comfortable. So is the natural way we process information to protect our self-image in the moment. By choosing to exit the matrix, we are asserting that striving for a more objective representation of the world, even if it is uncomfortable at times, will make us happier and more successful in the long run.
Talking about winning (even if we are identifying mistakes along the way to a win) is less painful than talking about losing, allowing new habits to be more easily trained. Identifying mistakes in hands I won reinforced the separation between outcomes and decision quality.
A lot of people were surprised to learn that the expert opinion expressed as a bet was more accurate than expert opinion expressed through peer review, since peer review is considered a rock-solid foundation of the scientific method. Of course, this result shouldn’t be surprising to readers of this book. We know that scientists are dedicated to truthseeking and take peer review seriously. Arguably, there is already an implied betting element in the scientific process, in that researchers and peer reviewers have a reputational stake in the quality of their review. But we know that scientists, like judges—and like us—are human and subject to these patterns of confirmatory thought. Making the risk explicit rather than implicit refocuses us all to be more objective.
Within our own decision pod, we should strive to abide by the rule that “more is more.” Get all the information out there. Indulge the broadest definition of what could conceivably be relevant. Reward the process of pulling the skeletons of our own reasoning out of the closet. As a rule of thumb, if we have an urge to leave out a detail because it makes us uncomfortable or requires even more clarification to explain away, those are exactly the details we must share. The mere fact of our hesitation and discomfort is a signal that such information may be critical to providing a complete and balanced account. Likewise, as members of a group evaluating a decision, we should take such hesitation as a signal to explore further.
If someone is off-loading emotion to us, we can ask them if they are just looking to vent or if they are looking for advice. If they aren’t looking for advice, that’s fine. The rules of engagement have been made clear. Sometimes, people just want to vent. I certainly do.
Imagine you go up that same $1,000 in the first half hour but now, over the next hour and a half, you can’t seem to win a hand and lose $900 back, ending the night with a $100 win. How does that feel? Now imagine that you lost that same $1,000 in the first half hour but then went on a winning streak to end the night down only $100. How does that feel? Most likely, you’re pretty glum about the $100 win but still buying drinks for everyone after recovering from that terrible start to only lose $100. So you’re sad that you won $100 and happy that you lost $100. The way we field outcomes is path dependent. It doesn’t so much matter where we end up as how we got there. What has happened in the recent past drives our emotional response much more than how we are doing overall. That’s how we can win $100 and be sad, and lose $100 and be happy. The zoom lens doesn’t just magnify, it distorts. This is true whether we are in a casino, making investment decisions, in a relationship, or on the side of the road with a flat tire.
Oettingen recognized that we need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures. We start a premortem by imagining why we failed to reach our goal: our company hasn’t increased its market share; we didn’t lose weight; the jury verdict came back for the other side; we didn’t hit our sales target. Then we imagine why. All those reasons why we didn’t achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding.