Josh

Murder-Gandhi and Conspiracy-Bob

(Or, "Bad Idea Radiation")


Scott Alexander tells the parable of Murder-Gandhi, who starts out as regular Pacifist-Gandhi: he thinks war is bad and wants to solve things with non-violent means like hunger strikes. But Pacifist-Gandhi is offered a pill that will decrease his reluctance to murder by 1% in exchange for $1 million, thereby becoming 0.99*(Pacifist-Gandhi) = 0.99PG = 0.01MG, one percent of the way to Murder-Gandhi. He's still very pacifistic and now has $1 million to improve the world with. Sounds like a great deal for everyone.


The problem comes when you iterate: Offer 0.99PG the same deal to become 0.98PG, and he will take it a bit more readily than the original PG did. And so on; anyone with $100 million could easily turn PG into murder-crazed MG. They probably would need less than that: if PG accepted the deal for $1 million, maybe 0.9PG would accept $900,000; and 0.05PG = 0.95MG would probably take the pill for next to nothing. The slope becomes slipperier as you slide down it.
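(A toy sketch, if you like that sort of thing: assume a made-up pricing rule where a more-murderous Gandhi demands proportionally less per pill, in line with the $900,000 guess above. The total bill for the whole slide comes in around $50 million, not $100 million.)

```python
# Toy sketch of the slippery slope, with an assumed (made-up) linear pricing rule:
# each pill cuts Gandhi's reluctance to murder by 1%, and the price he demands
# scales with his remaining pacifism.

pacifism = 1.00              # start as fully Pacifist-Gandhi (1.00PG)
base_price = 1_000_000       # PG demands $1 million for the first pill
total_paid = 0

while pacifism > 0.005:                  # keep offering pills until he's essentially MG
    total_paid += base_price * pacifism  # a more-murderous Gandhi demands less
    pacifism -= 0.01                     # each pill shaves off another 1%

print(f"Pills taken: {round((1.00 - pacifism) / 0.01)}")  # 100
print(f"Total paid: ${total_paid:,.0f}")                   # about $50.5 million
```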


Scott's solution is that PG could decide, in advance, that the deal is worth it to his current self only the first N times, and find some way of holding future-him to that standard in spite of that person's different opinions. PG could "swear a mighty oath" or "give all his most valuable possessions to a friend and tell the friend to destroy them if he took more" than N pills, thereby making sure future-him won't thwart present-him's goals.


I wonder if this applies to other things. I've said publicly that it's a good idea to talk to crackpots, if only to treat their wild-and-probably-false theories as raw material to build new and better ideas from. (Spencer Greenberg wrote something similar, not about crackpots specifically, but about learning from one data point.) Even if crackpot ideas are wrong, they are probably a wellspring of novelty, and you might have new insights about something else as a result of a conversation with someone who thinks so differently.


And yet, I worry about the Murder-Gandhi example. Suppose Bob believes some obviously-true thing like "The US government is not secretly run by Lizard People." But Bob wants to open himself to "the other side", or at least to the novelty of a very different way of thinking. So Bob, a 0% Lizard-Person Believer, decides to talk to someone who's totally convinced of the LP hypothesis. He knows they're wrong (as do I), but he tries to listen anyway, and they may have some ideas he's never heard (that's the point, after all). At the end, Bob still thinks LP is bunk, but he guesses maybe the stuff about the Bible is kind of interesting, and Mr. LP is not totally wrong to think the core of humanity is consciousness or "infinite awareness" (it's in vogue now to consider that consciousness may be an essential building block of existence). So Bob maybe ends up moving a tiny bit, on the margin, not toward belief in LP per se, but, you know, toward thinking there are interesting unexplained historical connections, and that maybe the government does more creepy stuff than he'd previously considered.


There's evidence that there's a sort of general factor of conspiracy-thinking: if you believe the Earth is flat, you're more likely to believe in Chemtrails or be a 9-11 Truther or whatever. Some of this is probably genetic, because everything is a little bit genetic. Some of it, though, is good Bayesian thinking: If you think there's a nefarious Deep State capable of pulling off a 9-11 conspiracy, then given that belief, it's more rational to think such a group might do other nefarious things; or generally, if you see lots of true conspiracies, you update in favor of other conspiracies.
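To make the "good Bayesian thinking" part concrete, here's a minimal sketch with entirely made-up numbers: a single hidden cause ("a powerful secret group exists") makes several otherwise-unrelated conspiracy claims more likely, so rationally accepting one claim really does raise the probability of the others.

```python
# Minimal sketch (hypothetical numbers) of the common-cause structure behind the
# "general factor": D = "a powerful hidden group exists"; C1, C2 = two separate
# conspiracy claims, each far more likely to be true if D is true.

def posterior_d_given_c1(prior_d, p_c1_if_d, p_c1_if_not_d):
    """P(D | C1 is true), by Bayes' rule."""
    num = p_c1_if_d * prior_d
    return num / (num + p_c1_if_not_d * (1 - prior_d))

def prob_c2(p_d, p_c2_if_d, p_c2_if_not_d):
    """P(C2), averaging over whether D is true."""
    return p_c2_if_d * p_d + p_c2_if_not_d * (1 - p_d)

prior_d = 0.05                        # prior on the hidden group
p_c1_if_d, p_c1_if_not_d = 0.5, 0.01  # claim 1 is far likelier if D holds
p_c2_if_d, p_c2_if_not_d = 0.4, 0.01  # same structure for claim 2

p_d_updated = posterior_d_given_c1(prior_d, p_c1_if_d, p_c1_if_not_d)
print(prob_c2(prior_d, p_c2_if_d, p_c2_if_not_d))      # ~0.03 before accepting claim 1
print(prob_c2(p_d_updated, p_c2_if_d, p_c2_if_not_d))  # ~0.29 after accepting claim 1
```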


Still, some of it is not quite rational, but instead animal; it feels good to identify a bad guy, to uncover a secret, to know something others fail to see. Some people identify with their beliefs, and so having extreme ones makes them feel more interesting; others, I think, just have a fetish for contrarian and counter-culture thinking. We're all on these spectra, so they are things to watch out for.


So Bob's conversation with Mr. LP makes tomorrow's conversation with a 9-11 Truther a little more interesting. Again, Bob is as close to a 0% 9-11 Truther as one can be, but after his LP discussion maybe he becomes a tiny bit more sympathetic to these ideas than before. If the Truther can move the needle even a little bit, then we're back to Murder-Gandhi.


The Murder Pill is a good example, but it doesn't capture the extent to which things like this happen in the background of our minds all the time, without our knowing. I like to think of it as sort of like radiation. Radiation is fine, in small doses; everything emits some amount of radiation of some energy, and usually your body absorbs it without harm. But stand too close to a strong source, and you start to take damage. You can put up shields and guard yourself in various ways--studying Rationality is itself a shield against bad-idea radiation--but sometimes you just have to identify the source and resolve to move away from it. Wait too long, and it could be too late to save you.


Because we're not all perfect Bayesians, this happens naturally, especially when our interlocutor is someone we like or someone we think is smart in other domains. In some sense, this is what communities do; they commit a kind of brainwashing without even trying. If everyone around you says "Evolution is true" and "Vaccines work", then after long enough exposure you're very likely to believe those things. If everyone around you keeps saying "9-11 was an Inside Job", then it's going to be very hard not to let the needle move in that direction. And once it moves, the next time is a little easier. And a little easier. Death of rationality by a thousand little tick-tick-ticks on the Geiger counter.


A few years and a few hundred conversations later, Bob has become Conspiracy-Bob, complete with a YouTube channel about Flat Earth and Ancient Aliens. The radiation has changed him into a mutant version of himself that he'd never have approved of before.



(Yes yes, I know, some conspiracies are true and so not all conspiracy theories are bad. You know what I mean.)




What to do about this? Well, if you were a perfect Bayesian updater then there would be little reason to worry; you'd take the best evidence Mr. LP and Mrs. 9-11 had to offer, discard the rest, and update your beliefs just the right amount. A dedicated and dispassionate focus on evidence is, as usual, the best answer.


But you're not perfect. And, like Pacifist-Gandhi, you may be skeptical of future-you's ability to stay true to your current principles if you start to move in the other direction even a little. Take a few (rational!) steps towards the cliff, and you may feel an unexpected urge to peek over the edge. So, maybe some caution is in order. I understand the impulse of those who say, "Keep that idiotic conspiracy theory away from me, I'm trying to do useful work over here!" It's a warning against "a waste of time", sure, but it's also fear and self-preservation. They miss out on whatever is novel and useful about crackpots, but maybe it's for the best.


Or, you could take Scott's advice from the original parable: set down a Schelling fence in advance that you cannot move past. You vow, "No matter how strong the evidence sounds, I will move at most to 0.95 of my current skeptical self, that is, to 0.05*(Lizard-Person-Believer), because a future-me who has drifted past 0.05LPB can't be trusted to update appropriately on the topic." Maybe this is possible to do (though knowing your own degree of belief is very difficult). This is, of course, a justification for irrationally strong beliefs and dogma, but again, it's a guard against much worse outcomes that can arise because you fail to know yourself.
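As a minimal sketch (the 0.05 fence and the likelihood numbers below are just illustrations), the vow amounts to doing an ordinary Bayesian update and then refusing to let your credence cross the pre-committed ceiling:

```python
# Toy sketch of the belief fence: an ordinary Bayesian update, clamped at a
# ceiling chosen in advance. All numbers here are made up for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Standard update of P(claim) on one piece of evidence."""
    num = p_evidence_if_true * prior
    return num / (num + p_evidence_if_false * (1 - prior))

def fenced_update(prior, p_evidence_if_true, p_evidence_if_false, fence=0.05):
    """Same update, but never allowed past the pre-committed fence."""
    return min(bayes_update(prior, p_evidence_if_true, p_evidence_if_false), fence)

credence = 0.001                       # Bob's starting credence in the LP hypothesis
for _ in range(10):                    # ten persuasive-sounding conversations in a row
    credence = fenced_update(credence, 0.9, 0.3, fence=0.05)
print(credence)                        # never exceeds 0.05, no matter how many talks
```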


My personal attitude is to not fear the radiation, but make sure that I'm coupled to a sufficiently strong sink of good ideas in case I start to go over the edge. I'll talk to the Lizard Person Believer, and maybe my needle will move a bit, but then I'll discuss it with other people with more conventional views; they will either go along with my move (a good sign that I was rational in the first place) or pull me back. If I move too far, I trust my friends to quickly rescue me from disaster. After all, humans may be at their most rational when they are in groups that are willing to challenge one another, more rational than even the best individuals.


There's also a more general point here about how one lives one's life: surround yourself with good ideas most of the time. They say you are the average of the five people you spend the most time with; make sure they are making you stronger, not weaker, so that you can withstand some bad ideas from time to time without running to the bomb shelter.




On a personal note: maybe you are setting out to change minds, rather than learn about something yourself. Unfortunately, it seems to be almost impossible to change someone's mind about a topic they feel strongly about, if you make that your goal at the outset. People want to be listened to; they want to explain themselves, and have their partner really listen. If you cut off their explanations, roll your eyes, or jump in with things like "That's stupid, here's how you need to think about it", they're going to dismiss you as quickly as you have dismissed them.


Often people who believe "crazy" things know how others view them, and that becomes part of their identity as well. If you actually want to reach these people, you have to meet them halfway--or at least partway. In what ways is their thinking plausible, or even correct? You have to open yourself up, or else they'll just see you as an enemy, in which case nobody involved is likely to learn anything.


Still, be careful and know yourself before taking the first few steps towards a nuclear waste pile.

