We all know what a persuasive stalemate feels like. We know the awful grind of making the right points, marshaling the right evidence, arguing in good faith, and making absolutely no progress toward changing someone’s mind.
Nowhere is it more common than in political conversations (“conversation” may be too polite a word for what most of us find ourselves in): The entire country feels like it’s stuck in one big wheel-spinning persuasive failure, where the same facts seem to yield different verdicts, or where agreeing on the facts at all seems impossible.
So what can we do about the political stalemates we find ourselves in, other than keep sinking hours into the same conversations over the same dinner tables, conversations that accomplish nothing except making us want to beat our interlocutors, and sometimes ourselves, with the salt shaker?
I’ve spent two years interviewing people who changed their minds about the beliefs that were most important to them. I spoke to people who left cults, people who found out their families were criminals, people who discovered their identity wasn’t what they thought it was, people who had to rethink the accuracy of their own memories. They all taught me how hard it is for minds to really change — and how strange it is to talk about changing “minds” as though minds weren’t attached to people. When we try to change a belief, we are often trying to change a person.
Here is the bad news: I cannot, with any confidence, give you a fail-safe guide to changing a person’s mind. I can say with confidence that anyone who sells you a persuasive strategy claiming to change a person’s mind without intricate knowledge of their circumstances or the genealogy of their beliefs is either lying or grievously mistaken. But I can offer one strategy that I think we overlook too often in our political debates, from the dinner-table kind all the way up to the televised kind with the suits and the hairspray-lacquered journalists.
The strategy is this: Ask where people’s beliefs come from as well as what they actually believe.
All too often we pay attention to the outputs of people’s belief systems and not the systems themselves. We preoccupy ourselves with what they believe at the expense of why they believe it. But often the beliefs themselves are just the visible surface of a complex personal system, and trying to remove the belief itself without looking at its causal history is like cutting off the visible leaves of a weed and never wondering about its roots.
This is true for all sorts of beliefs, but it’s especially true for political ones: We so often busy ourselves with slinging facts and evidence at each other that we forget to ask where our opponents’ beliefs came from. And if their beliefs didn’t spring from evidence, then evidence might not be the best tool to remove them.
Let me give you an example. Dylan used to be a member of a strict apocalypse-heralding sect. In a lot of ways its structure mirrored that of a particularly tight political group: it was suspicious of outsiders, cultivated a mistrust of people who suggested that they might be wrong, punished people with social exclusion for associating with nonbelievers and was organized around the charismatic authority of leader-figures. It sounds worryingly familiar to anyone whose family members have found a political identity and clung to it like it’s driftwood and they’re drowning.
Dylan changed his mind — in his 20s, after having been raised in the group. An impressive change. But he didn’t change his mind because of any evidence against one of his beliefs. It wasn’t about what he believed. It was about how he’d come to believe it.
He fell in love with a woman named Missy, a nonbeliever (though she didn’t tell him that at the time — in fact, she spent five years pretending to believe what he did). His elders never liked Missy, and he couldn’t fathom why. When the elders told him to choose between his wife and salvation, he realized he trusted his wife more than he trusted them. And since his beliefs about the apocalypse had taken hold only because he trusted the elders, changing his mind about their character meant changing his mind about every other belief that rested on their word.
This change couldn’t have come from simply debating the evidence for and against his beliefs. In fact, he had those debates; Missy argued with him all the time. Until he had to look at the causal chain behind his beliefs, the beliefs themselves stayed strong.
Our political debates and the ways they fail have a lot in common with Dylan’s story. Many of our political views didn’t arise out of evidence, and yet we persist in thinking evidence is the best way to displace them. This is how it’s possible to sink vast amounts of energy and money into “debating the facts” — between candidates on TV in front of red, white and blue banners, or in the wormholes of an online comment section — and still see astonishingly low rates of mind-changing.
Debates like these start to feel like recreational combat. They draw viewers, and they feel like entertainment, but they don’t bring us to better beliefs. In fact, psychological experiments suggest that hearing evidence against what we believe can make us believe it even more.
This isn’t an argument against speaking to each other. It is an argument against speaking to each other in the ways we always have, with the same presumptions about which debates work and why. Our usual “clash of ideas” models deal with the results of our belief systems, not the systems themselves. If we want to make political progress we don’t need more debate — we have plenty. We need to look at the roots of people’s beliefs.
Eleanor Gordon-Smith is a philosopher and radio producer. Her work has appeared on “This American Life,” the Canadian Broadcasting Corp. and the ABC. She is the author of “Stop Being Reasonable: How We Really Change Our Minds.” Follow her on Twitter @TheRealEGS.