Listening to a Rationally Speaking episode from last year (ostensibly about the debate over sex differences in the human brain), I ran across this gem of a quote from Julia Galef:
…maybe when people are giving their arguments or stating their position, what they’re doing (unconsciously, probably) is not trying to state exactly what they think the truth is, but they’re trying to state something that will move the overall consensus closer to what they think the truth is. It’s sort of like, if everyone was voting on how they wanted money to be spent, or something, in the budget, and I actually thought that we should spend 1/4 of the money on education, but everyone else thought we should spend only 10%. I know that my vote isn’t going to count for much, so in order to get us to 1/4 I have to say that “we should spend 90% on education” or something, and everyone’s doing that. And so when you listen to what people think the divide between innate and socialized differences is or something, they’re not quite saying what they really think, or at least their topic sentence, the headline, of their position is not saying what they really think: it’s saying what they think will move the debate towards the right thing, according to them. And it makes things so confusing.
I’ve had this thought before but never heard it expressed so eloquently, and I find the idea both very correct and very upsetting.
It’s like you’re playing Tug Of War, that game where you and your team pull a rope against another team, except in this version your team’s rational goal is just to hold the rope steady at a particular point:
The game starts at center. Team A pulls the rope a little to the left, but Team B reacts by pulling a little harder than Team A to get the rope back to the right. Team A pulls a little harder, so of course Team B pulls a little harder, and so Team A pulls harder, and so Team B pulls harder, until soon both teams are pulling as hard as they possibly can. It would seem like a miracle if the net result of both sides pulling as hard as they could ended up with a thoughtful, well-reasoned result (I’m still assuming Teams A and B are thoughtful, reasonable people). But at every step, both teams did what was reasonable to do.
In Julia’s example, members A and B of the budget committee might think 10% or 25% (respectively) of the budget should be allocated to education. But when the argument begins they might argue that 0% and 90% of the budget should be allocated to education. Then the whole thing becomes a negotiation rather than a discussion, which is not a good format for determining what is true or what is effective in the world outside of A and B.
The flaws here seem to be: (1) setting your goal as fixed before the conversation begins; (2) treating the “other team” as negotiating partners rather than interlocutors from whom one might learn. And this is assuming the teams have good, rational arguments for their views! The situation is already dire, and we haven’t even accounted for the additional costs of tribalism, the sunk cost fallacy, and other bad reasoning!
At the risk of making this longer than I planned, here are some examples that come to mind:
Julia’s podcast (above) was about the debate over sex differences in the brain. She gives the example of one outspoken critic of the idea of innate sex differences in the brain, Lise Eliot. Dr. Eliot is quoted as saying, in a Nature article: “The brain is no more gendered than the liver or kidneys or heart.” Yet in her work she holds a nuanced view, and is apparently quite open to (e.g.) the idea of innate brain differences between boys and girls at birth. Julia was suggesting that Dr. Eliot is playing Tug of War.
Donald Trump was just some weird billionaire guy who hosted a reality show and appeared in Home Alone 2. Nobody cared much about him, or thought he was particularly great or particularly evil. Then he ran for president, and five years later he is, on one hand, sent by God to save the country from hell (maybe even a prophet himself), or, on the other hand, Literally The Second Coming Of Hitler. I claim everyone involved is playing Tug of War.
I want to say that the best way to win this game is not to play, but that’s only true if everybody stops playing Pull As Hard As You Can. The position of the rope here determines public policy or public opinion, and both matter in the real world; people who pull as hard as they can have a natural advantage over those of us pulling just hard enough. So our Tug of War game is also a prisoner’s dilemma, in which we know many people will defect by making the most extreme claims possible. Our best outcome is to cooperate: putting down the rope, discussing both teams’ goals dispassionately, and trying to agree on what’s best. On a large scale, it’s hard to imagine getting there.
I don’t have any solutions to this, and I end this post feeling unsettled and filled with despair.
(By the way, note that Robin Hanson also compared policy debates to Tug of War (in a different way), and wrote a great post about it.)