
Review of The Scout Mindset


(2023)

[This book review is a slightly modified version of a review originally published on the author’s SelfAwarePatterns blog.]

 Review: Julia Galef. 2021. The Scout Mindset: Why Some People See Things Clearly and Others Don’t. New York, NY: Portfolio. 288 pp.

Julia Galef is the host of the podcast Rationally Speaking, which I’ve listened to for years and recommend. She’s a rationalist concerned with improving the way she and others think, and as a result she often puts out material critiquing typical reasoning mistakes. As Sean Carroll pointed out when interviewing her, this tends to put a target on her back: she’s frequently criticized by others when they perceive that she has failed to live up to the standards she espouses. However, Galef herself admits many of these failings and doesn’t hold herself out as perfect, only as someone who studies reasoning and strives to be better at it.

This is pretty much the purpose of her book, The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Galef begins by describing two mindsets in which to approach a proposition: the soldier mindset and the scout mindset.

The more natural mindset, and the one that we most commonly fall into, is the soldier one. In this mindset, if we’re presented with a proposition that we dislike—one that we’d prefer not to be true—we ask: “Must I believe it?”

On the other hand, if presented with a proposition that we do like and want to be true, we’re more likely to ask: “Can I believe it?”

This is all in contrast with the scout mindset, which simply asks: “Is it true?”

Galef admits up front that the soldier mindset isn’t always bad, and that it can have some benefits, including emotional ones such as higher comfort, self-esteem, and morale, as well as social ones including better persuasion, reputation, and camaraderie. Conversely, a scout mindset allows us to make better judgment calls. A good portion of the book makes the case, however, that we can actually get many of the soldier benefits from the scout stance because of the better judgment calls.

Obviously if we’re interested in truth, being a scout is the way to go. The problem is that most of us take ourselves to be scouts even when we’re not. It’s trivially easy to see the soldier impulse in others, particularly when we disagree with them, but very hard to detect it in ourselves. Galef provides a number of criteria to assess how close you might be to a scout. No one scores perfectly on all of these qualities all the time. The idea is to assess how often you meet them:

  1. “Do you tell other people when you realize they were right?” (p. 51)
  2. “How do you react to personal criticism?” (p. 52) (This is more about track record than about what attitude we think we hold. We’re all familiar with bosses who insist that they want honesty from their subordinates, only to lash out when they actually get it.)
  3. “Do you ever prove yourself wrong?” (p. 53)—particularly after taking a public stand on something?
  4. “Do you take precautions to avoid fooling yourself?” (p. 55)
  5. “Do you have any good critics?” (p. 56) That is, are there critics that you consider thoughtful, that make valid points, even if you ultimately disagree with them?

Galef offers a number of thought experiments to use to help us notice bias in ourselves. It’s worth noting that these only have power if we truly imagine the alternate scenario. The kid who is asked if he would be okay with being hit in the same way that he just hit another kid, and who claims in response that he’d be fine with it, likely isn’t really imagining the alternate scenario.

  • The Double Standard Test: Are we holding one group to a different standard than another? For example, if a politician in the opposite political party is doing something that we’re inclined to judge harshly, would we judge them in the same way if they were in our own party (or another party closer to our own preferences)?
  • The Outsider Test: Would we come to different conclusions or make different decisions if we didn’t have our current background in relation to the matter?
  • The Conformity Test: Would our opinion be the same if others around us didn’t share it? Or if someone that we admire didn’t hold it?
  • The Selective Skeptic Test: If the evidence supported the other side, how credible would we find it? (This strikes me as a variation of the Double Standard Test.)
  • The Status Quo Test: If the current situation weren’t the status quo, would we select it over the possible alternatives that we’re considering?

These lists provide a representative sample of the flow of the book. Galef organizes her content in lists like these throughout, generally devoting a few pages to each item.

There are a number of other points in the book that I find interesting:

  • One is that people judge us on our social confidence more than our epistemic confidence. In other words, not only is it okay to admit when we don’t know something; doing so can actually raise people’s assessment of us. Related to that is the point that leaders don’t need to make unrealistic promises of success to be inspiring. Galef notes the case of Jeff Bezos, who throughout the history of Amazon.com was completely honest with potential investors about how slim his chances of success were, an honesty that projected such competence that it actually attracted venture capitalists.
  • Another is a better way of thinking about how we change our minds. Rather than “admitting” that we were wrong, Galef suggests thinking of belief change as simply “updating” our beliefs. She notes that this reframing doesn’t take the sting away completely, but it does lessen it. It also helps if we had already acknowledged whatever uncertainties existed in our previous position. She also notes that if we’re not at least occasionally changing our minds, we’re doing something wrong.
  • Related to this is having more realistic expectations about how others change their minds. Almost no one changes their mind quickly. Beliefs on any contentious topic are typically part of a constellation of interrelated beliefs, all of which may need to be changed, or at least adjusted, for the person’s mind to change on the belief in question. In other words, often a personal paradigm shift is necessary. So expecting a conversation partner to change their mind during the conversation is unrealistic. And we should be open to the possibility that we may be the one whose mind is eventually changed.

Toward the end of the book, Galef gets into the factor of identity and how it often clouds our judgment, putting us into a tribal (soldier) mindset. Apparently identities can form around just about any subject matter. I was surprised to learn about the long-standing conflict between mothers who breastfeed their babies and those who use formula. The animosity between these two groups seems like it’s more than a simple disagreement about infant nutrition.

Galef notes that she once resolved to avoid identity labels such as “vegan.” On the one hand, using such a label quickly conveys a lot of information that is awkward and tedious to convey otherwise. On the other, it tends to associate us with all of the baggage tangled up in that identity, including the tribal conflicts with other identities. This reminded me of my own reluctance to accept labels—even when they mostly describe my outlook. Often this reluctance does come down to not wanting to be embroiled in that identity’s tribal conflicts.

Galef’s eventual solution was to accept (some) identities, but to wear them lightly, as things contingent and provisional, things that we hold to only as long as they describe our position or goals. Doing so allows for more flexible thinking. It allows us to say something like “Yes, I’m an Xr, but I don’t agree with those particular Xrs” without feeling obligated to defend people just because they’re on our team, or to oppose others just because they’re on the other team.

An amusing aspect of the book is Galef’s annoyance with how rationalists are portrayed in fiction, notably Spock in Star Trek. She notes how often he fills the role of the “straw Vulcan,” the coldly logical character who ends up being wrong due to his lack of passion. She describes how Spock is often illogical, typically because he fails to take into account the illogical nature of those around him, or to learn from his prediction misses. She has an entire appendix cataloging the times that Spock is wrong in the original Star Trek series.

As someone always interested in improving my own reasoning, I found a lot of useful information in this book. It resonated with other techniques that I’ve collected over the years on having productive Internet conversations, fostering an open mind, or communicating across different levels of understanding. Something that might have strengthened it would have been a discussion about the role of emotions and how much they can cloud our reasoning. But all in all, if you’re interested in finding ways to think more clearly, this book is worth checking out.

Copyright ©2021 by Mike Smith. This electronic version is copyright ©2023 by Internet Infidels, Inc. with the written permission of Mike Smith. All rights reserved.