Julia Galef is the host of the podcast Rationally Speaking (which I’ve listened to for years and recommend). She’s a rationalist concerned with improving the way she and others think. As a result, she often puts out material critiquing typical reasoning mistakes. As Sean Carroll pointed out recently when interviewing her, this tends to put a target on her back, since she’s frequently criticized by others when they perceive she fails to live up to the standards she espouses. However, Galef herself admits many of these failings and doesn’t hold herself out as perfect, just as someone who studies reasoning and strives to be better at it.
This is pretty much the purpose of her book, The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Galef begins by describing two mindsets in which to approach a proposition: the soldier mindset and the scout mindset.
The more natural mindset, and the one we most commonly fall into, is the soldier one. In this mindset, if we’re presented with a proposition we dislike, one that we’d prefer not to be true, we ask, “Must I believe it?”
On the other hand, if presented with a proposition we do like and want to be true, we’re more likely to ask, “Can I believe it?”
This is in contrast with the scout mindset, which asks, “Is it true?”
Galef admits up front that the soldier mindset isn’t always bad, and that it can have some benefits, including emotional ones such as comfort, self-esteem, and morale, as well as social ones including persuasion, reputation, and camaraderie. Conversely, a scout mindset allows us to make better judgment calls. A good portion of the book makes the case that we can actually get many of the soldier benefits from the scout stance, precisely because of those better judgment calls.
Obviously if we’re interested in truth, being a scout is the way to go. The problem is that most of us take ourselves to be scouts, even when we’re not. It’s trivially easy to see the soldier impulse in others, particularly when we disagree with them, but very hard to detect it in ourselves. Galef provides a number of criteria to assess how close you might be to a scout. No one is perfect at all of these all the time. The idea is to assess how often you meet them.
- Do you tell people when you realize they were right?
- How do you react to personal criticism? This is more about track record than what attitude we think we hold. We’re all familiar with the boss who insists they want honesty from their subordinates, only to lash out when they actually get it.
- Do you ever prove yourself wrong, particularly after taking a public stand on something?
- Do you take precautions to avoid fooling yourself?
- Do you have any good critics, that is, critics you consider thoughtful, who make valid points, even if you ultimately disagree with them?
Galef offers a number of thought experiments to help us notice bias in ourselves. It’s worth noting that these only have power if we truly imagine the alternate scenario. The kid who is asked if he would be okay with being hit the way he just hit another kid, and who claims he’d be fine with it, likely isn’t really imagining the alternate scenario.
The Double Standard Test: Are we holding one group to a different standard than another? For example, if a politician in the opposite political party is doing something we’re inclined to judge harshly, would we judge them the same way if they were in our own party (or another party closer to our own preferences)?
The Outsider Test: Would we come to different conclusions or make different decisions if we didn’t have our current background in relation to the matter?
The Conformity Test: Would our opinion be the same if others around us didn’t share it? Or if someone we admire didn’t hold it?
The Selective Skeptic Test: (This strikes me as a variation of the Double Standard one.) If the evidence supported the other side, how credible would we find it?
The Status Quo Test: If the current situation wasn’t the status quo, would we select it over possible alternatives we’re considering?
These lists are a good sample of the way the book flows. Galef organizes her content in lists like these throughout, generally devoting a few pages to each item.
There were a number of other points in the book I found interesting.
One is that people judge us on our social confidence more than our epistemic confidence. In other words, not only is it okay to admit when we don’t know something, doing so can actually raise people’s assessment of us. Related to that, leaders don’t need to make unrealistic promises of success to be inspiring. Galef notes the case of Jeff Bezos, who throughout the history of Amazon.com was completely honest with potential investors about how slim his chances of success were, an honesty that came from such an apparent place of competence that it actually attracted venture capitalists.
Another is a better way of thinking about how we change our minds. Rather than “admitting” we were wrong, Galef suggests thinking about it as simply “updating” our beliefs. She notes that it doesn’t take the sting away completely, but it does lessen it. It also helps if we had already admitted any uncertainties that might have existed with our previous position. She also notes that if we’re not at least occasionally changing our mind, we’re doing something wrong.
Related to this is having more realistic expectations about how others change their minds. Almost no one changes their mind quickly. Beliefs on any contentious topic are typically part of a constellation of interrelated beliefs, all of which may need to be changed, or at least adjusted, for the person’s mind to change on the belief in question. In other words, often a personal paradigm shift is necessary. So expecting a conversation partner to change their mind during the conversation is unrealistic. And we should be open to the possibility that we may be the one whose mind is eventually changed.
Toward the end of the book, Galef gets into the factor of identity, and how it often clouds our judgment, putting us into a tribal (soldier) mindset. Apparently identities can form around just about any subject matter. I was surprised to learn about the long-standing conflict between mothers who breastfeed their babies and those who use formula. The animosity between these two groups seems like it’s more than a simple disagreement about infant nutrition.
Galef notes that she once resolved to avoid identity labels such as “vegan”. Using such a label quickly conveys a lot of information that is awkward and tedious to convey otherwise. But it also tends to associate us with all the baggage tangled up in that identity, including the tribal conflicts with other identities. This reminded me of my own reluctance to accept labels, even when they mostly describe my outlook. Often it does come down to not wanting to be embroiled in that identity’s tribal conflicts.
Galef’s eventual solution was to accept (some) identities, but to wear them lightly, as things contingent and provisional, something we hold to only as long as it describes our position or goals. Doing so allows for more flexible thinking. It lets us say something like, “Yes, I’m an X, but I don’t agree with those particular Xers,” and not feel obligated to defend people just because they’re on our team, or oppose others just because they’re on the other team.
An amusing aspect of the book is Galef’s annoyance with how rationalists are portrayed in fiction, notably Spock in Star Trek. She notes how often he fills the role of the “straw Vulcan”, the coldly logical character who ends up being wrong due to his lack of passion. She describes how Spock is often illogical, typically because he fails to take into account the illogical nature of those around him, or to learn from his prediction misses. She has an entire appendix cataloging the times Spock is wrong in the original Star Trek series.
As someone always interested in improving my own reasoning, I found a lot of useful information in this book. It resonated with other techniques I’ve collected over the years for having productive internet conversations, fostering an open mind, and communicating across different levels of understanding. Something that might have strengthened it would have been a discussion of the role of emotions and how much they can cloud our reasoning. But all in all, if you’re interested in finding ways to think more clearly, this book is worth checking out.
What do you think? From the snippets provided here, are Galef and the rationalists on the right track? Hopelessly misguided? Are there other techniques that can help put us in the scout mindset?