Caution On Bias Arguments
“You say it’s important to overcome biases. So isn’t it hypocritical that you’re not trying to overcome whichever bias prevents you from realizing you’re wrong and I’m right?”
— everybody
Correcting for bias is important. Learning about specific biases, like confirmation bias or hindsight bias, can be helpful. But bias arguments – “People probably only believe X because of their bias, so we should ignore people who say X” – tend to be unproductive and even toxic. Why?
1. Everyone Is Biased All The Time
You could accuse me of having a conservative bias. After all, I’m a well-off straight white man, a demographic well-known to lean conservative. If a liberal wanted to discount everything I say, or assume any conservative arguments I make come from self-serving motives, they’ve got all the ammunition they need.
Or you could accuse me of having a liberal bias. After all, I’m a college-educated atheist Jewish psychiatrist in the San Francisco Bay Area. All of those demographics are well-known to lean liberal. If a conservative wanted to discount everything I say, or assume any liberal arguments I make come from self-serving motives, they’re not short on ammunition either.
This is a general phenomenon: for any issue, you can think of biases that could land people on one side or the other. People might be biased toward supporting moon colonization because of decades of sci-fi movies pushing space colonization as the wave of the future, or because Americans remember the moon landing as a great patriotic victory, or because big defense companies like Boeing will lobby for a project that would win them new contracts. Or people might be biased against moon colonization because of hidebound Luddite-ism, or an innate hominid preference for lush green forests and grasslands, or a pessimistic near-termism that rejects projects with payoffs more than a few years out. I personally might be biased towards moon colonization because I’ve been infected with the general Silicon Valley technophile mindset; or I personally might be biased against it because I’m a Democrat and Trump’s been the loudest modern proponent of more moon missions.
This is even easier if you’re allowed to invent biases on the spot. For example, I said people are against moon colonization because of “hidebound Luddite-ism” – is that actually a thing? If I say that regulatory action against tech companies is driven by anti-tech populism, have I identified a bias, made up a bias, or just tautologically rebranded people wanting regulation of tech companies as a force biasing people towards regulation of tech companies? Won’t people who support regulation counter by saying that opponents are just knee-jerk technophiles who have drunk some sort of Silicon Valley hype Kool-Aid?
2. Everyone Is Hypersensitive To Biases Against Their Side, And Thinks Biases In Favor Of Their Side Are Irrelevant
This is called the hostile media effect, though it’s broader than just the media. I’ve talked about it before in Against Bravery Debates. My favorite example is conservatives complaining that the media condemns far-right terrorism but excuses Islamic terrorism (eg 1, 2, 3, 4, 5) alongside liberals complaining that the media condemns Islamic terrorism but excuses far-right terrorism (eg 1, 2, 3, 4, 5).
Or if you prefer facts to anecdotes: according to a Gallup poll, conservatives are more likely to believe the news has a liberal bias; liberals are more likely to believe the news has a conservative bias. In a study where experimenters showed partisans a trying-to-be-neutral video on the Israel-Palestine conflict, the pro-Israel people said the video was biased toward Palestine, and the pro-Palestine people said the video was biased toward Israel.
This ties into the problem where you can just make up a bias, like “hidebound Luddite-ism”. Technophiles will see an anti-tech bias everywhere. And whenever they meet a specific anti-tech person, they can assume that their positions have been shaped not by reason, but by the anti-tech sentiments that are omnipresent in our society. Having explained away their opponents’ position as the product of bias, they’ll feel no need to debate it or question whether it might be true.
Anyone can come up with any bias for any position, but this meta-bias is going to affect people’s sense of which biases matter and which ones don’t. Pro-moon-colonizers are going to doubt that technophilia is really a problem motivating people’s reasoning, but think that hidebound Luddite-ism is a big problem motivating everyone on the other side.
3. It’s Hard To Even Figure Out What Bias Means Or When It Is Bad
Suppose A and B are debating some issue, and B is part of a group especially closely linked to the issue. For example:
1. A plumber and a teacher are debating a proposed pay cut for teachers.
2. A man and a woman are debating abortion.
3. An atheist and a Jew are debating the peace process in Israel.
4. A white person and a black person are debating slavery reparations.
5. A citizen and an undocumented immigrant are debating immigration policy.
6. King Edward and a Jew are debating whether to expel all the Jews from England.
7. You and a KKK Grand Wizard are debating whether the KKK should be banned as a hate group.
8. A scientist and a tobacco company executive are debating whether cigarettes are dangerous.
Who is more biased? A or B?
This is a tough question. If we’re just working off the dictionary definition of bias, it ought to be B. But in cases like 6, it would be pretty bad to adjust away from B’s opinion, or discount B as too biased to give a good argument.
We can’t dismiss this as “A is also affected by the issue”. It’s true that, for example, the plumber has some stake too: if the pay cut fails, his taxes will stay a little higher to cover teacher salaries. But since there are far more taxpayers than teachers, each taxpayer’s stake is much smaller than each teacher’s. It still seems like B should be more biased.
We could model this as two opposite considerations. A is less biased. But B may be better informed. Sometimes this is literal information: I’d expect an immigrant to know more about immigration policy than an average citizen. Other times it can be emotional “information” about how something feels; for example, a woman may have hard-to-communicate information about what makes abortion rights feel important to her.
Is it meaningful to say the Jew has hard-to-communicate information about how much he doesn’t want to be kicked out of England? Or should we just say that, as the person most affected by the policy, he’s had the strongest motivation to think it through carefully? But now we’ve come full circle to saying that motivated reasoning itself is good!
I have a hard time squaring this circle. The lesson I take is that it’s easy to switch between “we should trust the more affected party less” and “we should trust the more affected party more” without clear principles to guide us.
Probably most people will do this in a biased way. When their side is the more affected party, they’ll say that gives them special insight and so other people should back off. When they’re the less affected party, they’ll say that makes them unbiased and other people are just motivated reasoners. This is yet another reason to expect that bias arguments have so many degrees of freedom that everyone will figure their opponents are biased and that they themselves aren’t.
4. Bias Arguments Have Nowhere To Go
Most people are already aware of their potential biases. No straight man will be surprised to be told that he is a straight man, or that this might bias him. “You are a straight man, so consider that you might be biased” doesn’t give new information. It just deflects the conversation from potentially productive object-level discussion to a level which is likely to sound patronizing and overly personal, and which has less chance of being productive.
Suppose someone asks me “Are you sure you don’t just hold that opinion because of the liberal Jewish milieu you grew up in?” I look deep into my brain, the opinion still sounds right, I don’t see a sticker on the opinion saying “Proud product of the liberal Jewish milieu you grew up in”, and…then what? Do I drop the opinion even though it still seems right? Do I keep holding the opinion, but feel guilty about it? Do I retort “Aha, no, you only hold your opinion because of the conservative Gentile milieu you grew up in, so you should drop your opinion!”?
There’s a sense in which we should always be considering the Outside View (see part III here) for each of the opinions we hold. That is, on the Inside View, the opinion might still seem convincing, but on the Outside View, we might have enough circumstantial evidence that it was produced by some process uncorrelated with truth that we doubt it despite its convincingness. But just learning that there’s some possible bias should rarely have much of an effect on this process, especially since with any self-awareness we should probably have already priced all of our own biases in.
5. Where To Go From Here
I think low-effort (and even medium-effort) arguments from bias will usually be counterproductive. Second-person bias arguments (“You are probably biased on this topic because X”) and third-person bias arguments (“Society is probably biased on this topic because X”) are at least as likely to perpetuate biases as to help overcome them, and less useful than just focusing on the object-level argument.
What’s left? Bias is an important obstacle to truth-seeking; do we just ignore it? I think bias arguments can be useful in a few cases.
First, it’s fair to point out a bias if this gives someone surprising new information. For example, if I say “The study proving Panexa works was done by the company producing Panexa”, that might surprise the other person in a way that “You are a straight man” wouldn’t. It carries factual information in a way that “You’re a product of a society laden with anti-tech populism” doesn’t.
Second, it’s fair to point out a bias if you can quantify it. For example, if 90% of social scientists are registered Democrats, that gets beyond the whole “I can name one bias predisposing scientists to be more liberal, you can name one bias predisposing scientists to be more conservative” arms race. Or if you did some kind of study, and X% of social scientists said something like “I feel uncomfortable expressing conservative views in my institution”, I think that’s fair to mention.
Third, it’s fair to point out a bias if there’s some unbiased alternative. If you argue I should stop trusting economists because “they’re naturally all biased towards capitalism”, I don’t know what to tell you, but if you argue I should stop trusting studies done by pharmaceutical companies, in favor of studies done by non-pharma-linked research labs, that’s a nice actionable suggestion. Sometimes this requires some kind of position on the A vs. B questions mentioned above: is a non-Jew a less biased source for Israel opinions than a Jew? Tough question.
Fourth, none of this should apply in private conversations between two people who trust each other. If a well-intentioned smart friend who understands all the points above brings up a possible bias of mine in a spirit of mutual truth-seeking, I’ll take it seriously. I don’t think this contradicts the general argument, or is any different from other domains. I don’t want random members of the public shaming me for my degenerate lifestyle, but if a close friend thinks I’m harming myself then I want them to let me know. I’m realizing as I’m writing this that this paragraph deserves its own essay, and that it would probably be a better and more important essay than this one is.
Most important, I think first-person bias arguments are valuable. You should always be attentive to your own biases. First, because it’s easier for you; a rando on Twitter may not know how my whiteness or my Jewishness affects my thought processes, but I might have some idea. Second, because you’re more likely to be honest: you’re less likely to invent random biases to accuse yourself of, and more likely to focus on things that really worry you. Third, you have an option besides just shrugging or counterarguing. You can approach your potential biases in a spirit of curiosity and try to explore them. I think I’m probably biased against communism because many communists I’ve met have been nasty people who tried to hurt me, so I try to solve that by reading more communist books and seeking out good communist arguments wherever I can find them. Second- and third-person bias arguments risk pressuring you into awkwardly changing your opinions to something you don’t really believe in order to deflect someone’s bias accusations. First-person bias arguments should lead to a gradual process of looking for more information to counter whatever motivated reasoning you might have.