
Worse Than Random

We all want to know what’s right, and to remove from our minds whatever is wrong. You might think that “wrong” is the worst thing to be, but in my field we have a category that’s considered worse.


A theory or calculation is called right if it corresponds to the way the world actually works, and it's wrong if you test it / work out the details and find it to be flawed. (I'm ignoring complicated epistemological details here, but you know what I mean.) But if a theory is so imprecise that you can't test it, or so malformed that you can't articulate it clearly, then we call it not even wrong; the phrase supposedly originated in a famously epic burn by physicist Wolfgang Pauli against a younger colleague. This is the harshest criticism you can level against a theory: it's not even wrong.


For example, consider the mechanism for the Earth’s orbit around the sun.

  • The right answer is (as far as we know) that the Earth is held in orbit around the sun by gravity, which is governed by Einstein’s equations of General Relativity. (Actually Newtonian gravity gets the effects right but fails to provide a mechanism.)

  • An example of a wrong answer is the Le Sage / Fatio model of ultra-small particles bouncing off of massive bodies (explained well in this Feynman Lecture, starting around 7:45); we know it’s wrong because at least one of its consequences is inconsistent with experiment.

  • A not even wrong answer is that invisible and undetectable angels push the planets around by beating their wings (also explained by Feynman, starting around 17:09). It fails to achieve the status of wrong because it can’t be tested: angels evade detection by fiat, so no observation could possibly falsify the idea.

Or consider the question of what atoms are made of.

  • The right answer is that negatively-charged electrons exist in a quantum ‘probability cloud’ bound to a central positively-charged nucleus, which contains protons and neutrons that are themselves made of quarks and held together by the strong nuclear force.

  • We know many theories to be wrong, including ones that we used to think were right, like the plum-pudding model, where electrons float around like plums in a positively-charged “pudding”. This idea is inconsistent with e.g. Rutherford-type scattering.

  • A not even wrong theory is “squish theory”, which sometimes gets emailed to unwitting physicists who have the audacity to believe in quarks. I've tried to read it, I've tried to understand it, and I can't help but come away with the sense that it's not clearly defined enough to know where to begin figuring out whether it could be right.

An analogy: there is a large bag full of blue and red balls, representing correct and incorrect beliefs respectively. To be right is to pull a blue ball; to be wrong is to pull a red one. Of course, there are better and worse ways to find blue balls and avoid red ones. But to be not even wrong is to refuse to pull a ball at all, and to argue instead that whatever you hold in your hands is more or less as good as a blue ball.


\\


Let's consider a different epistemological axis, which I'll call systematic vs. random. A conclusion is random if it is pulled without discrimination from a set of propositions; it is systematic if it is thoughtful and carefully reasoned. Note that being systematic does not necessarily mean being right. Plenty of systematic conclusions are still wrong. For example, you might have all the available evidence and have thought it through carefully, but the evidence itself may not be very good. Or you might have half of the story, and it fits together perfectly, right up until you discover a second set of equally compelling evidence that changes everything.


There are also plenty of systematic beliefs that are systematically wrong. For example, my system might be "believe whatever my astrologer tells me" or "believe every third thing I hear". These are systematic by my definition, but they are probably bad systems for the purpose of arriving at true beliefs.


Random probably seems like the worst you could do. After all, almost everything you could believe is false, which is to say, most balls in the bag are red, so it is nearly impossible to pull a rare blue ball from a giant bag of red ones at random. The random/systematic axis is about the method of finding blue balls rather than the result, and there are better and worse methods: you can search carefully (direct observation), develop tools for sorting (improved methodology), or notice whether blue balls tend to sit on one side of the bag (developing theory). These are systematic ways of finding the truth with higher and higher fidelity.
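
To put a rough number on "nearly impossible", here is a minimal back-of-the-envelope sketch in Python. The one-in-ten-thousand fraction of blue balls is entirely invented for illustration; the point is only how slowly random pulling pays off.

```python
# Toy version of the bag: suppose one belief in 10,000 is true.
# (The fraction is invented; the point is only that blue balls are rare.)
p_blue = 1 / 10_000

# Chance that n purely random pulls (with replacement) turn up
# at least one blue ball: 1 - (1 - p)^n.
for n in (1, 100, 1_000, 10_000):
    p_hit = 1 - (1 - p_blue) ** n
    print(f"{n:>6} random pulls -> P(at least one blue) = {p_hit:.3f}")
```

Even ten thousand random pulls find a blue ball only about 63% of the time under this toy assumption; a systematic search changes those odds entirely.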


Still, as with right/wrong/not even wrong, I think there’s a category that is not systematic and still worse than random.


I've heard since childhood that a million monkeys typing on a million typewriters for a million years will eventually reproduce Shakespeare. (I'm sure I have the magnitudes wrong here, but see the aptly-named Infinite Monkey Theorem for details [1].) This works, in fact, because pure randomness eventually samples every possible state of the system; it is ergodic. If your random monkey brain selects beliefs totally at random, it will be wrong far more often than not, but eventually it will hit on a correct belief. There are only so many balls in the bag to choose from, after all.
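
For the curious, here is a quick sketch of the monkey experiment in Python. The 27-key keyboard (lowercase letters plus a space) and the tiny target are my own simplifications to keep the runtime sane, but the ergodic point comes through: random typing always gets there, just exponentially slowly.

```python
import random
import string

ALPHABET = string.ascii_lowercase + " "  # a 27-key monkey keyboard (my simplification)

def keystrokes_until(target: str, rng: random.Random) -> int:
    """Type uniformly at random until `target` appears in the stream."""
    window, count = "", 0
    while not window.endswith(target):
        # Keep only the last len(target) characters typed so far.
        window = (window + rng.choice(ALPHABET))[-len(target):]
        count += 1
    return count

rng = random.Random(0)
runs = [keystrokes_until("to", rng) for _ in range(1_000)]
print(f"'to' took ~{sum(runs) / len(runs):.0f} keystrokes on average "
      f"(rough theory: 27**2 = {27**2})")

# Longer targets blow up exponentially: 'to be or not to be' is 18
# characters, so it needs on the order of 27**18 keystrokes -- an
# absurd number, but a finite one. Randomness gets there eventually.
print(f"'to be or not to be' needs on the order of {27**18:.1e} keystrokes")
```

The average for 'to' lands near the rough 27² estimate, which is what ergodicity buys you: every string, however unlikely, turns up in finite expected time.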


What you would want to avoid is a situation in which you never have a chance to pull a blue ball at all. Of course, there are systematic ways to fail to sample correct outcomes, e.g. you only ever pull from one side of the bag because of some personal bias. But there are non-systematic ways too, and these are what I want to call Worse Than Random (WTR).


A conclusion is WTR if it is drawn quasi-randomly and then never reflected on. In short, you've put no effort into the belief. You grabbed whatever ball touched your hand first: you believed X because a friend told you, or because you heard it on the news, and you accepted it reflexively. But then, rather than keeping your credence low (appropriate for something you have reflected on not at all), you reflexively cranked it up.


The thing is, though, you haven’t thought it through. Not even a little. You’ve never been pressed, you’ve never tried to articulate it, you’ve never laid out the evidence for and against. Maybe on reflection you would decide you believe your friend, or that you can trust whatever news source or expert gave the take. Sometimes, appeal to authority is appropriate.


But you didn't reflect; you just accepted the view.


And it nonetheless feels like you hold the view strongly.


This is what I mean by WTR.


\\


When I reflect on this tendency in myself, I notice many topics on which I subjectively feel like I hold a strong view, without ever having reflected on them at all:

  • Cultural differences between Japan and China, between Iran and Iraq, or between France and Switzerland;

  • Which milks are the healthiest for human consumption;

  • The prevalence of ADHD in America;

  • Vaccine safety (update August '21: I've thought about it much more now and hold a real view of my own);

(It's very hard to come up with examples, because thinking of an example is the first step towards breaking the WTR spell, and almost by definition I have never done that with these topics.)


It feels like I have a strong opinion on these topics; when I bring one to mind, a conclusion quickly follows. But even two seconds of reflection reveals that there is nothing underneath to support the opinion in my mind.


Where did these views come from? If someone really challenged me on them, I would be unable to muster even a small amount of pre-baked thoughtfulness. If, on one of these topics, I tend to side with the prevailing view or with some expert, it is not because I've reflected and determined that this sort of question is best adjudicated by experts in the field. No, I have truly never thought about it, and that's not ok.


I'd much rather have thought things through and be wrong than hold a view without reflection. At least in the former case I would be capable of having a conversation about the topic, learning and becoming less wrong. As it stands, I haven't put in even the smallest modicum of effort that would make me worthy of holding a view. I haven't even earned the status of saying my belief was random. Frankly, I'd better just shut up about it.


One possible explanation of WTR thinking is "myside bias": first impressions stick, because humans tend to lock in on whatever they first think might be true. Once you have committed to a view, there's some social cost to being a "flip-flopper". It's as though you pulled the first ball from the bag, a pull which was random, but you then decided that it's YOUR ball and you must defend it against all comers. If anyone asks you what the right answer is, you answer THIS IS IT, with the confidence of someone who has never thought about it even a little.


Leo Tolstoy said:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

This is WTR in action.


\\


What are some ways to notice when your beliefs are Worse Than Random? And what can you do about it?


Personally, I usually can’t tell until I let my beliefs out of my head, either by having a conversation or by writing them down carefully. That's a big part of the reason for this blog. Often I have the feeling that I know something well, and then I try to write down what I think, and discover that I know nothing and my strong belief is based on nothing. Or I have a conversation, and find myself arguing for a view that I discover I do not even hold; it's as if an alarm bell goes off in my head reminding me that I am making up my view on the fly and compelling me to be quiet. If you experience this, don't fight it; it is important. You might call this realization a WTR moment.

Wile E. Coyote believes he can keep running beyond the edge of the cliff; it turns out the belief isn't based on anything, and all he has to do is look down to see that he must fall. That's a WTR moment.


If you feel this yourself, you can of course see it as impetus to do some research, to learn about the thing about which you know nothing. But beware! Your strong view, WTR-ly held, may deceive you into reading the evidence in a biased way. Before you begin, use the WTR moment as a palate cleanser, and consciously try to dissolve the feeling of holding a strong belief. "I know nothing about this, even though I feel like I do. Huh. I wonder what's actually true about this topic?" Get curious, and use that curiosity to fuel your search. But first, let yourself fall to the bottom of the cliff: relinquish your belief, to the extent that you can.


(Also, notice if, upon reflection and research, you always seem to "discover" that you were right all along; you may be more biased than you think. It would be surprising if your random and WTR beliefs turned out to be true by luck alone.)


Alternatively, after you use the WTR moment to relinquish the belief, it's ok to replace it with nothing. Become comfortable with uncertainty, because it may not be worth your time to figure everything out for yourself; you can't know everything about everything, so why do you always need to have an opinion? You'll have to sit a few conversations out, saying "Actually I don't have anything to say about that," but that's a much better situation than backing a view you don't even hold (though you think you do).


\\


This quote by Robert Wilensky perfectly sums it up, I think:

We’ve all heard that a million monkeys banging on a million typewriters will eventually reproduce the entire works of Shakespeare. Now, thanks to the Internet, we know this is not true.

Beware, because like people on the internet, you are worse than a million monkeys; many of your own ideas are likely Worse Than Random.


----------------------------------------------------------------------------------------------------------------------


[1] By the way, someone actually did reproduce the complete works of Shakespeare, using a million (digital) monkeys, nine characters at a time. Check it out.
