The Anti-Hansonian Heuristic
I often disagree with people who know more about a given topic than I do. For example, the typical economist who works in Industrial Organization is familiar with a lot more current antitrust research than I am. Robin Hanson notwithstanding, though, I’m not willing to defer to the IO expert’s greater knowledge. After all, I reason, if I did immerse myself in the modern literature, it’s a lot more likely that I would arrive at a sophisticated version of my current view than that I would radically change my mind.
When I argue with people who are better-informed than I am, then, I generally say the following: “I grant that you’ve seen a lot of evidence that I haven’t. But here’s my question for you: If I saw and read everything that you’ve seen and read, what would I conclude?”
Of course, the other guy could respond, “You’d agree with me,” but he rarely does. When you frame the question as I have, it’s often pretty clear that even though the disputants are not on a level playing field, that isn’t the real reason why they hold different views.
One nice feature of my heuristic: It actually makes disagreement more informative. Suppose you have a well-informed friend who strongly disagrees with you. Nine times out of ten, he admits that if you knew what he knows, you’d still disagree. But one time out of ten, he insists that if you knew what he knows, you’d change your mind. As long as you trust your friend, such a statement makes it reasonable to immediately adjust your belief.
I suspect that Robin Hanson will be disturbed by my heuristic. After all, it lets every person retain his view that his prior is “special.” You could even call my method the Anti-Hansonian Heuristic, because it deliberately ignores the fact that lots of smart people persistently disagree with you.
In response to Robin, though, I’d say that (a) it’s almost impossible to convince anyone that his prior isn’t special – and my heuristic improves the quality of beliefs despite this impasse; and (b) since my prior is special (laugh if you must!), this is a great heuristic for me to live by.
P.S. Another nice feature of my heuristic: You can occasionally run an audit. One time out of ten, you can actually put the better-informed person on the spot: “OK, share your evidence, and let’s see if I react to it as you predicted.”
Aug 18 2008 at 2:47pm
I tend to operate on the assumption that disagreements are the result of one side (or possibly both sides) not having the correct set of facts. When I disagree with someone, I try very hard to be open to the possibility that I’m the one who’s misinformed.
But if I am the one who’s the expert, I’m going to tend to assume that once provided with my expert level knowledge, Bryan would agree with me. (Perhaps he’d continue to disagree as a Devil’s Advocate.)
However, there is a scenario in which I might think that Bryan would continue to disagree after I presented him with my greater knowledge. That scenario is if I’m more informed than Bryan, but I know that I’m not terribly well informed in general. So for example, I might know a little bit about skydiving because I’ve jumped 2-3 times. But suppose Bryan knows nothing about it. In that case, I would not be confident enough in my own knowledge to think that I could convince him. Bryan, being a smart guy, might draw a different set of conclusions than I drew based on the facts that I shared with him.
So I tend to think my answer to Bryan would be determined by how confident I was about the correctness of my conclusions. The higher my confidence, the more likely I’d be to expect Bryan to agree with me when presented with my knowledge.
Aug 19 2008 at 1:54am
And I claim the exact opposite of what mjh does. I believe:
1) a large portion of disagreements people have are actually disagreements about values.
2) values have no truth value. That is, there’s no way I could possibly demonstrate to you that your values are ‘wrong,’ and in fact the entire idea is pretty meaningless.
Thus I think a lot of disagreements are fundamentally irresolvable regardless of the level of information present.
Aug 19 2008 at 3:02am
I agree with jadagul.
Your approach only makes sense for political subjects (economics, sociology, history, and so on).
If you disagree with a solid-state physicist about the consequences of the Meißner-Ochsenfeld effect, or with your dentist about your teeth, be clever: change your mind.
Aug 19 2008 at 4:19am
If the disagreement is ultimately based on values, then you are probably right. But if the disagreement is factual in nature, then one or both of you is biased, and you should try to overcome that, rather than rationalizing your position. (Well, you can do whatever you want, but I would rather have the facts than stubbornly defend my position.)
Aug 19 2008 at 6:53am
From this post (and previous ones like it) you seem to be working from David Hume’s notion that “Reason is the servant of the passions,” i.e., clever people just come up with better justifications for their previously held beliefs.
But intellectual life is supposed to be a sanctuary from this stuff. “Follow the evidence wherever it leads!” say the anti-Caplanites.
You probably retort “That’s not how actual human beings work!”
And my reply, finally, is that the norms people have and the goals they aspire to are flexible (though not infinitely so). If you continually repeat the ethos that “on intellectual matters, we follow Reason,” you will get behavior that more closely adheres to that goal.
Isn’t that what you want, Bryan?
Aug 19 2008 at 10:33am
Intelligence in one field does not eliminate ignorance in another. Embrace experts, but be skeptical. Seek out your own information from a variety of sources and demand from yourself both the courage to stick to your guns and the flexibility to change your own mind.
Ask yourself one question: Is it rational to assume that “if I did immerse myself in the modern literature, it’s a lot more likely that I would arrive at a sophisticated version of my current view than that I would radically change my mind”? Maybe. Is it rational to assume this is always (or most likely) the case whenever you immerse yourself in another field? I don’t think so. Your current beliefs were most likely shaped by immersing yourself in a field at a younger age, and that immersion may well have changed your views at the time. Why the mental rigidity now?
I believe, however, that you probably won’t agree with me 🙂
(Long-time reader who defers to your economic experience/intelligence/opinions in most cases. My comparative advantage is found in other fields!)
Aug 20 2008 at 9:19am
@Jadagul: if disagreements are mostly about values and values have no truth value, then what is the point of arguing about them?
Blue is my favorite color. I don’t find myself spending a lot of time trying to argue that blue is the best color. It’s pointless to argue with someone who prefers red or green.
But when we disagree, and those disagreements turn into arguments, it seems to me the reason is because each side seems to think that the other side is missing some facts. If both sides knew that it was simply an issue of values, why would they argue?
Aug 20 2008 at 2:07pm
mjh: because people are under the mistaken impression that value judgments have truth-values? For instance, most people seem to be under the impression that there’s some objective reason why it’s wrong to murder people.
Aug 20 2008 at 3:39pm
@Jadagul: Who is the arbiter of which things are questions of fact and which are questions of value?
Aug 21 2008 at 1:37am
mjh: If you can prove it to be true in a way that no one could possibly disagree with it, then it’s a fact. 🙂 (“an argument will convince a reasonable man; a proof convinces an unreasonable one.”)
More generally, I have a sort of radically anti-objective epistemology. Direct sensory input is a fact. That is, I treat it as a fact that I perceive the existence of a computer screen, and I perceive that there are letters on it, and I interpret these letters to mean that you’re asking me a question. This is a fact. (At least it is for me; for you, it’s a fact that you see letters on the screen that you interpret as me claiming that these are facts for me). Everything else is interpretation or value.
Given any pattern of facts, I can come up with arbitrarily many theories that will match that fact pattern; therefore, I don’t think we can really have any evidence that a theory is true. (Old story I’ve heard attributed to Sartre: how would you know for sure that God exists and wants you to worship him? Suppose the limit case: An angel of the Lord appears to you, in his full glory, and tells you that God exists and wishes to be worshipped. And yet, it’s possible that you’re hallucinating. Or that the angel is an alien trying to trick you. Or that he’s talking to someone else–maybe God doesn’t care if you worship him, but really wants prayers from the invisible fairy two feet behind you. Or maybe you just have a memory error. And if an omnipotent God couldn’t prove to you that he exists, how do you expect to be sure of anything else?
Of course, God could just move some neurons around and make you believe he exists. But then, he could also move some neurons around and make you believe that he doesn’t. My point is that neither belief is ‘justified.’)
As such, I would also modify Bryan’s actual argument. I privilege my prior not because it’s right compared to some objective ‘truth’ value; all priors are equally valid. I privilege my prior because it’s mine, and so I think it’s right. If I didn’t think it was right, it wouldn’t be my prior. But I realize that if I had some other prior, I would think that one was right, too.
But similarly, I value the things I value. I don’t think you can reduce things any further than that.
Aug 23 2008 at 1:30am
Bryan, how do you square this position with your advocacy of deference to experts in your book?
It seems to me that your position relative to the expert in IO is analogous to the average citizen’s position relative to the economist. Okay, maybe there’s a larger gap in the latter case. Nevertheless, you have championed bowing to the superior wisdom of people with more informed and better educated views.