Updated: Jan 4
I’ve been thinking a lot about honesty lately. This is in part because I’ve recently moved significantly towards an “honesty all the time” policy,* both for personal reasons and because honesty and good faith are prerequisites for intellectual discourse. I credit the shift to the very good arguments of Sam Harris’s book “Lying” (though given the particular deontology of my Christian upbringing, my stance may have been overdetermined).
( * Before you object, we can exclude very extreme, thought-experiment-like circumstances. That’s not the point of this post.)
Beyond this, I’ve been wondering about the following question: Under what conditions can something be true, but you don’t want to know it, even in a world where everyone is trustworthy? I don’t mean cases where the information could be abused by others; in my scenario, you can trust the people around you to have the best intentions. And I don’t mean that it would merely be painful to hear in the short term. Is there a scenario where your long-run well-being is not compatible with the knowledge of some fact about the world? This is a high bar to set.
Bad example: Would you want to know how to make an atomic bomb? Absolutely you would. The physics that teaches you how to make the bomb is the same physics that teaches you a bunch of other useful things, like nuclear power, and if everyone is trustworthy then no one will abuse the information.
Bad example: Would I want to know that my wife was cheating on me? Of course I would. The fact of her cheating is likely a consequence of some malfunction in the relationship, or of some other incompatibility; only if I find out about this can my partner and I decide whether the two of us can work out the issues, or break up. Ignorance provides none of those options.
When I’ve brought the question up to people, I’ve gotten a few interesting responses (both quotes below are from one particular friend of mine). The first is:
People often tell a dying person everything will be okay when it is clear they are going to die. This is highly acceptable.
If someone is dying soon, with a high degree of certainty, then maybe the best thing for them is to happily live out whatever days they have left. I’m actually pretty sympathetic to this, in the specific scenario where death is imminent. On the other hand, in Sam’s book he presents various examples of people lying to each other about long-run health problems, like MS or treatable but dangerous cancers, and this seems much more sinister; there it’s better, I think, to be honest about what’s happening with your loved ones and get through difficult things together.
We often say false things to little kids to encourage rather than discourage them. Also acceptable.
Should we lie to kids? Here I’m less open to the idea. We of course shouldn’t tell kids about horrible things that they can’t handle at young ages; you don’t explain genocide or terrorism to 5-year-olds. But my (naive?) view (as a non-parent) is that it’s okay to tell your kid, “There are complicated things in the world that I’ll tell you about when you’re older.” It seems to me analogous to not letting your kid drive your car until they’re a teenager; you don’t need to hide the car in the meantime, just don’t give them full access right away. On other topics, like (e.g.) the “wonderful!” art children produce, it doesn’t have to be deceptive to say “I love this! Great job! I’m proud of you!” even if it’s just scribbles; if you tell them they’re the next Rembrandt, then that’s a lie (and a useless one at that).
There’s some evidence that people who successfully self-deceive about their abilities perform better in competitions. Maybe that’s true, but it’s not a path forward for someone trying to optimize their beliefs. As Eliezer Yudkowsky has pointed out (as have others), deliberate conscious self-deception is either extremely hard or outright impossible. You don’t get to say, “Starting now, I believe I’ll do better than I believed I would before, because that will improve my performance.” You aren’t in control of such things.
Here’s a different example, of my own conjuring: whiskey tasting. Imagine that I just don’t have the palate to distinguish a really nice scotch from (say) Jim Beam… but I think I do. When I drink what I know to be Beam, I think, “Wow, this isn’t very good at all.” When I drink what I know to be a nice scotch, I instead savor the aroma and sip carefully, paying attention to the complex flavors. Now, if someone convinced me to do a blind taste-test and I discovered that I really can’t tell the difference, maybe my life would be made worse by this true information. Now when I drink the scotch, I’ll think, “Why did I even pay for this? I might as well drink Jim Beam.” I can at least imagine that my self-deception was making my life better by convincing me that I enjoyed the scotch more than the Beam.
On the other hand, the knowledge that there is a real difference but that I can’t currently tell might inspire me to learn to notice more of the differences, e.g. by taking a tasting class. That first negative hit from the truth may open further doors which are positive.
(On the other, other hand, maybe it’s better to swing our mindfulness the other way, and start savoring every moment of Jim Beam too.)
What’s different about the whiskey example is that it’s a lie to yourself rather than a lie to someone else. Maybe there are things that are true about me such that I would be less well off if I found out. I don’t know.
Anyway, I’m kind of open to the idea that there are certain lies to oneself that could improve well-being. Between people, I’m much more skeptical, and I think supposedly justified lies are usually just symptoms of a deeper underlying mistrust. The long-run solution to that seems to be more honesty, not less.