Tyler Cowen introduced an important new idea in 2011 and gave it a name: the fallacy of mood affiliation. His idea is sound and important; the name he gives it and even the way he defines it are faulty. Here’s Tyler’s original statement.
It seems to me that people are first choosing a mood or attitude, and then finding the disparate views which match to that mood and, to themselves, justifying those views by the mood. I call this the “fallacy of mood affiliation,” and it is one of the most underreported fallacies in human reasoning. (In the context of economic growth debates, the underlying mood is often “optimism” or “pessimism” per se and then a bunch of ought-to-be-independent views fall out from the chosen mood.)
Tyler gives 4 examples. The first 3 don’t fit his own definition. In all 3, people don’t appear to be choosing the mood. Instead, they have come to a certain conclusion about something, a conclusion that might be justified, and then dismiss contradictory evidence.
Consider his first example. (I won’t bother going through them all.)
1. People who strongly desire to refute those who predicted the world would run out of innovations in 1899 and thus who associate proponents of a growth slowdown with that far more extreme view. There’s simply an urgent feeling that any “pessimistic” view needs to be countered.
It could be that people have chosen a mood or attitude on this, but there’s a good chance that that’s not what’s going on. For one thing, it’s hard to choose moods. They tend to happen based on the interaction between certain views we have about the world and certain real or perceived facts about the world. Tyler’s right that there’s probably an urgent feeling that a pessimistic view in the above case needs to be countered. But there’s a good chance that it’s not because of the “chosen” mood. There’s a better chance that it’s because the person has reached some conclusions and doesn’t want to alter those conclusions to fit what might be a more complicated reality.
Again, I’m not criticizing Tyler’s point that there’s an important fallacy here. I’m saying that he’s got the name wrong and he even defines it wrong.
So what’s my name for the fallacy and what’s my definition? I might be able to do better but for now I think a better name is the “confirming evidence” fallacy. And the definition is: “Having reached a conclusion, possibly based on strong evidence, someone tries to counter all apparently contradictory or undercutting evidence by dismissing it.”
READER COMMENTS
Charles Mojkowski
Oct 5 2019 at 7:37am
I recall an HBR or Sloan article on “decision-based evidence making,” or something to that effect.
Kevin L
Oct 5 2019 at 8:08am
So is it distinct from Confirmation Bias? Or a special case of Confirmation Bias?
Chris
Oct 5 2019 at 8:13am
You are quicker at the keyboard than I am.
David Henderson
Oct 5 2019 at 2:50pm
It’s more general than confirmation bias. It’s looking for evidence that supports a view of the world, not just the specific view on a narrow issue.
Chris
Oct 5 2019 at 8:13am
How new is this, really? How new was it in 2011 even?
From the first sentence of the Wikipedia entry for confirmation bias:
“the tendency to search for, interpret, favor, and recall information in a way that affirms one’s prior beliefs or hypotheses.”
These concepts seem to be at least first cousins, if not siblings.
RPLong
Oct 5 2019 at 9:35am
I think mood affiliation is distinct from confirmation bias in the following way: Confirmation bias causes a person to exclude evidence that contradicts her already-held opinion, whereas mood affiliation causes a person to first choose an opinion based on her mood, and then subject it to confirmation bias.
I might refuse to consider good arguments for nuclear power out of confirmation bias; but I might be dead-set against nuclear power in the first place due to mood affiliation, the general fear of the technology or distrust of those who promote it, or even just the mood affiliation toward beliefs that “earth-friendly technologies are 100% renewable and never pollute.”
David Henderson
Oct 5 2019 at 11:32am
You’re making the point that Tyler’s making and my point is that it’s wrong. I don’t think people typically choose moods, as I pointed out in my post.
Robert
Oct 5 2019 at 3:54pm
Is the sticking point here the word “choose”? Perhaps we should simply say that people end up (however that happens) in one mood or another and then choose their beliefs accordingly. Would that address your issue? I think the key thing is that many actions are driven by emotions rather than reason or beliefs.
RPLong
Oct 7 2019 at 9:03am
Oddly enough, I think I’m closer to your view than to Cowen’s. I agree that Cowen’s name for this is misleading. The way I’ve always interpreted it is that people choose a certain ethos or a certain idiom, and then once they’ve identified that, choose from a set of theories that reinforce that ethos or idiom. This sounds to me a lot like what you’re saying, but maybe I haven’t gotten it quite right. “Ethos affiliation” might be closer to how I view it.
David Tufte
Oct 5 2019 at 7:29pm
I wholeheartedly agree, David. I think Tyler is onto something, and I’m not sure I know what it is either (I did ask him in an email once, and the answer I got still wasn’t satisfying). But I like it, and I think it’s important. Yes, cognitive dissonance, and it’s been bugging me these last 8 years.
I do like RPLong’s interpretation. Perhaps “mood” is the wrong word. Mood seems too short-term for me. But I think “affiliation” might be the perfect word: loose but with the root leaning towards some form of love.
It’s definitely not confirmation bias. RPLong is right that that comes along later. And while I think your “confirming evidence fallacy” is interesting, I wonder how often “strong evidence” is part of anyone’s thinking. So I think you’re getting at something that might describe what us blogosphere people do, but not the (sometimes quite smart) people who don’t think that way.
How about “perspective affiliation”? One sometimes adopts a perspective towards how the world works, and does so largely on the basis of emotional valence. Then the dismissiveness towards contradictory or undercutting evidence kicks in.
What I’m envisioning is something like, say, Krugman dismissing evidence that, say, Piketty, might be wrong.
nobody.really
Oct 8 2019 at 1:32am
Eh. I wonder if Cowen is simply describing a psycho-social phenomenon–tribalism–as seen through the prism of libertarianism? People identify their tribe, adopt the attitudes of others in their tribe, and adopt the beliefs that tend to put their tribe in a flattering light and lead to the conclusions that favor their tribe’s interests.
While some people (a petulant teenager?) may adopt a cynical pose regarding everything in the world, most people who adopt such a pose also have topics upon which they speak enthusiastically and earnestly. That is, most people aren’t uniformly cynical; they’re cynical about topics that their tribe tells them to be cynical about, but not otherwise.
That said, I know someone with a curiously sour disposition. Complaining is just her style of conversation. She delights when the home team is losing, apparently because it provides more fodder for discussion (and puts more people in a foul mood, too?). Most of the time, it’s not hard to get along with her; you just find something you can jointly complain about. So in HER case, perhaps I would acknowledge a kind of mood affiliation.
Gerald Biggers
Oct 8 2019 at 2:12am
David,
This theory, or at least the examples of it that you cite (which do not seem to fit his definition), seems to be a restatement of a fairly old but still often discussed theory in social psychology: Leon Festinger’s “theory of cognitive dissonance.” It also echoes the term “rationalization” from the dynamic psychotherapy techniques of Alfred Adler and Sigmund Freud.