Humankind seems to have invented reasoning, and it has mostly served us well. But reasoning evolved as a skill among plains apes, who developed it as one tool serving the bigger goal of cooperative living. It’s increasingly clear that reasoning as a tool can be subverted by that larger goal. Three books WC has read recently have reinforced that point.
WC has already written about Michael Lewis’s The Undoing Project, which describes the work of Daniel Kahneman and Amos Tversky, two men who revolutionized the understanding of human behavior and heuristics. They systematically established that we all have cognitive biases which impair our ability to objectively receive and analyze information. That was bad news for those of us who have set our goal as trying to inform folks and maybe change their minds. But Kahneman and Tversky didn’t address the reasons for the cognitive biases and heuristics they identified.
Jack Gorman and Sara Gorman, in Denying to the Grave: Why We Ignore the Facts That Will Save Us, analyze the gap between what science tells us and what we tell ourselves. Sara Gorman, a public health specialist, focuses on the vaccine “controversy.” There, the divergence between fact and falsity – the claim that vaccines are dangerous – is potentially deadly. In fact, it is self-destructive; in evolutionary terms, it is maladaptive, contrary to the choice that would carry the greater chance of survival. The Gormans speculate that at some point the behavior, the seeming inability to sensibly evaluate risk, must have been useful. And they tie it to the necessity to cooperate.
Early in the history of modern humans, they argue, specialization began to develop. No single person could hold all the knowledge necessary for survival. Not everyone could be a great hunter; not everyone could be a great knapper; not everyone could recognize useful plants. You relied on the expertise of others to survive. That reliance, that deference to the knowledge of others, is at the root of today’s problems with “alternative facts,” with confirmation bias and with the heuristics Kahneman and Tversky identified.
Steven Sloman and Philip Fernbach wrote The Knowledge Illusion: Why We Never Think Alone, which largely parallels the Gormans’ conclusions. “As a rule,” they conclude, “strong feelings about issues do not emerge from deep understanding.”1 Sloman and Fernbach find that we tend to rely on the opinions of others in our circle without regard to the logic or rationality of those opinions. Ten people agreeing with the unfounded opinion of one person doesn’t make the unfounded opinion true, but it does lead to Donald Trump.
One of their experiments suggests a possible solution to this conundrum. They asked participants to rate their positions on issues like gun control, universal health care and merit-based pay for teachers. Then the participants were asked to explain, in detail, as specifically as possible, the impacts of implementing some of those policies. Then the subjects were asked to rate their positions on those issues again. On the issues they had been asked to think through, there was a measurable softening of their positions. There was no similar softening on issues they had not been asked to explain.
So it is possible, at least, to soften the herd mentality by challenging people to explain their positions. If you want a more specific example of the effect, think of Donald Trump’s reaction to Googling and learning a little bit about Obamacare. Oh, wait, that’s fake news.