Eureka! Economic Illiteracy as Mental Substitution
Here’s another revelation from Kahneman’s Thinking, Fast and Slow, from his chapter on “Answering an Easier Question.” The lead-in:
A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it…
His next step:
I propose a simple account of how we generate intuitive opinions on complex matters. If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution…
Consider the questions listed in the left-hand column of table 1. These are difficult questions, and before you can produce a reasoned answer to any of them you must deal with other difficult issues. What is the meaning of happiness? What are the likely political developments in the next six months? What are the standard sentences for other financial crimes? How strong is the competition that the candidate faces? What other environmental or other causes should be considered? Dealing with these questions seriously is completely impractical. But you are not limited to perfectly reasoned answers to questions. There is a heuristic alternative to careful reasoning, which sometimes works fairly well and sometimes leads to serious errors.
[Kahneman’s Table 1]
The mental shotgun makes it easy to generate quick answers to difficult questions without imposing much hard work on your lazy System 2. The right-hand counterpart of each of the left-hand questions is very likely to be evoked and very easily answered. Your feelings about dolphins and financial crooks, your current mood, your impressions of the political skill of the primary candidate, or the current standing of the president will readily come to mind. The heuristic questions provide an off-the-shelf answer to each of the difficult target questions.
I had a eureka moment when I read this passage. Consider the economic illiteracy intro econ professors face around every corner. How do students manage to combine such absurdity with such certainty? Via substitution. Faced with a genuinely difficult question, they answer a different, easier question, then conflate the answer to their question with the answer to your question. Like so:
[My Table 1′]
Target Question | Heuristic Question
Does the minimum wage help … | Would I be happy if employers gave …
What policies will make Americans … | What policies try to hurt people I …
… | Is it bad to be fired?
How much will Obamacare improve Americans’ health per dollar spent? | …
What is the most efficient level of tax progressivity? | How much do I admire/envy the rich?
Needless to say, economists could argue at length about which substitutions students make when we confront them with challenging questions. Better yet, we could try to empirically – even experimentally – triangulate their substitutions. Whatever the specifics, though, substitution is a plausible explanation not only of the absurdity of many popular views about how the economy works, but also of people’s certainty about these absurdities.
P.S. Have you ever read a more elegant account of “heuristics” than this?
The target question is the assessment you intend to produce.
The heuristic question is the simpler question that you answer instead.
The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. The word comes from the same root as eureka.
Jan 9 2012 at 10:40pm
In the table you switched the entries for the president row.
Jan 9 2012 at 11:24pm
You knew this all along. The advice I remember most from taking your exams?
“Answer the question I ask, not the question you want to answer!”
Jan 9 2012 at 11:40pm
Is there any fallacy that can’t be explained this way?
“Is the world round?” “Does the world look round to me right now?”
“Did evolution happen?” “Do I find it plausible that I’m descended from apes?”
Or, the universal example:
“Is X true?” “How do I feel about X being true?”
Jan 10 2012 at 12:06am
I think this might also explain why learning is so hard. We sometimes think of learning as acquiring rules and developing strategies to answer target questions in system 2. But if that’s the case, learning should be much more easily transferable between subjects than it really is.
Maybe learning is just adding to your inventory of heuristic questions, which you do by working with gradually less and less basic examples. That would also explain why experts are often very confident and very wrong when talking about subjects outside their field.
Jan 10 2012 at 12:14am
Wow. Ryan, that is brilliant. +1
Jan 10 2012 at 12:42am
I think Ryan is right. We almost always use System 1, but experts have better internalized heuristics in System 1 just ready to go.
Jan 10 2012 at 4:06am
When it comes to the question, “What policies will make Americans richer?” I suspect the heuristic most people use is more likely to be:
“What policies seem likely to make me richer?”
That would jibe well with the observation that we often overestimate the degree to which our own situations can be considered “typical.” Of course, different people (even those who are similarly situated) can answer that in different ways depending on things like their propensity to imagine their own happiness in the long run or the short run.
But call me an optimist: I think more people hope for their own well-being than for the suffering of those they dislike.
It could be, however, that certain people consciously or unconsciously frame debates in a way that leads more people to a particular heuristic, which could be why we see some politicians focusing not on questions of national prosperity but on who is paying their “fair” share of taxes.
“Is X fair?” is a question that is almost always answered by checking one’s gut.
J Storrs Hall
Jan 10 2012 at 5:18am
I demur to your use of “simpler” in your P.S. “Heuristic” was a technical term in computer science before it escaped into the conventional wisdom, and it referred to computational tractability in the algorithm, not simplicity of the question. The traveling salesman problem, for example, asks a simple question: “what is the shortest closed path through all these cities?” The algorithm is simple, too: try them all and compare. Heuristics involve asking a more complicated question (e.g. dividing the problem into subregions and recombining the answers) but one which can be answered with a shorter computation.
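To make the computer-science sense of “heuristic” concrete, here is a minimal Python sketch (not from the comment; the city coordinates are invented, and it uses the standard nearest-neighbor rule rather than the divide-and-conquer approach the comment mentions). Brute force answers the question exactly at factorial cost; the heuristic runs a far shorter computation and may miss the optimum:

```python
import itertools
import math

# Hypothetical city coordinates, chosen only for illustration.
cities = [(0, 0), (3, 0), (3, 4), (0, 4), (1, 2)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    # Length of the closed path visiting the cities in this order.
    return sum(dist(order[i], order[(i + 1) % len(order)])
               for i in range(len(order)))

def brute_force(cities):
    # Exact answer: try every ordering and compare (factorial cost).
    first, rest = cities[0], cities[1:]
    return min(([first] + list(p) for p in itertools.permutations(rest)),
               key=tour_length)

def nearest_neighbor(cities):
    # Heuristic answer: greedily visit the nearest unvisited city.
    tour, remaining = [cities[0]], set(range(1, len(cities)))
    while remaining:
        nxt = min(remaining, key=lambda i: dist(tour[-1], cities[i]))
        tour.append(cities[nxt])
        remaining.remove(nxt)
    return tour

best = brute_force(cities)
quick = nearest_neighbor(cities)
assert tour_length(quick) >= tour_length(best)  # heuristic is never shorter than the optimum
```

On five cities brute force is trivial; the point is that its cost explodes with the number of cities, while the heuristic stays cheap and merely "adequate, though often imperfect."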
Jan 10 2012 at 7:29am
Just a note…when you say ‘economic illiteracy’, what you actually mean is ignorance/rejection of neoclassical theory. They are two separate things.
Jan 10 2012 at 9:12am
I second J Storrs Hall. (An AI cynic might remark that a heuristic is anything that makes a search faster.)
Second Phil too. Absent evidence this looks like a universal solvent: use it anywhere, removes any problem (but try to contain it).
Jan 10 2012 at 9:41am
Sounds like there’s an economic education article to be written here.
Jan 10 2012 at 9:47am
I suggest that System 1 is more concerned with signaling. A good example would be normally intelligent people talking about gun control laws: whatever the merits of such laws, the discussion concerning them is profoundly stupid. When System 1 presents an opportunity to signal how benevolent (for liberals) or loyal (for conservatives) one is, it seems to become a bigger temptation.
Jan 10 2012 at 10:28am
I think this is a well-articulated intuition, but I don’t find it enlightening.
I’m more interested in why System 2 is so lazy and why we are so confident in our quickly generated answers to the substitute question.
Why is it so important that we feel we have to resolve the target question right now? Why don’t we say, “I’m not sure, I have to give that some more thought”?
Is it because we don’t have the incentive to get it right? Is it because we encourage folks to have an opinion no matter what (e.g. get out and vote campaigns)? Does it come from the education model where we take a test and move onto the next subject, even if we didn’t do so well?
And even more importantly, how do we stop that?
Jan 10 2012 at 11:54am
This isn’t actually different from how academic economists work – the first column is reality and the second column is a “model”. The key (in both the academic world and real life) is to recognize how the two are similar and how they differ and, from that, make a judgment about how the solution to the real problem is likely to relate to the solution you can actually get out of the model.
Jan 10 2012 at 12:44pm
Actually, I don’t find the questions in Kahneman’s Table 1 hard to answer. So….?
“Dealing with these questions seriously is completely impractical.” That’s incorrect.
Perhaps the author meant “Dealing with these questions fully in under 30 seconds is difficult.”
To address the full article, which to be honest I only read once, are you just trying to say people are stupid and have a hard time with grown-up questions?
Jan 10 2012 at 1:55pm
Doesn’t the most accurate economic illiteracy table simply lay out that which is seen and that which is unseen? The left column asks to consider that which is unseen. The right column is a heuristic that considers only the seen. In that sense, it’s not that new.
People are asked an economic question: The illiterate happily answer the heuristic question about the seen because they know the answer. They need to be taught to consider the unseen.
Jan 10 2012 at 3:14pm
Great post, Bryan. I wonder if, after reading this book, you have reconsidered your hypothesis that people are rational for private decisions and irrational for public ones. It seems to me that people are pretty bad at both private and public decisions if it involves System 2 a lot (to stick with Kahneman’s terminology).
Jan 10 2012 at 9:39pm
I’m using my own heuristic to conclude that this is more evidence that agent based modeling is the future of social science. It’s easy to program varying decision heuristics for agents. Much more realistic than DSGE or other more elegant mathematical models.
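As a toy illustration of how easily differing decision heuristics can be programmed into agents (everything here – the class name, the payoff distributions, the even split between types – is invented for the sketch): “deliberators” answer the target question of whether a policy actually pays off for them, while “substituters” answer the easier question of whether they like its stated beneficiaries.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

class Agent:
    def __init__(self, rule):
        self.rule = rule                         # which decision heuristic to use
        self.payoff = random.gauss(0, 1)         # policy's true net payoff to this agent
        self.warm_feeling = random.gauss(0, 1)   # affect toward the policy's beneficiaries

    def supports(self):
        if self.rule == "deliberate":
            return self.payoff > 0               # answer the target question
        return self.warm_feeling > 0             # answer the substitute question

agents = ([Agent("deliberate") for _ in range(500)] +
          [Agent("substitute") for _ in range(500)])
support = sum(a.supports() for a in agents) / len(agents)
print(f"share supporting the policy: {support:.2f}")
```

Swapping in other rules – or letting agents copy the rules of successful neighbors – is a one-line change, which is the commenter’s point about the flexibility of agent-based models.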
Also, if you really like the heuristics concept, check out this book, which argues that heuristics are often more useful than complex methods. Somewhat related note: I wonder why we don’t see more discussion of Herbert Simon and Thomas Schelling on econ blogs from people who favor microfoundations…
Jan 11 2012 at 5:36am
I have often wondered why so many economically right-wing parties are also socially conservative.
I wonder now if it is because they have difficulty selling complex right-wing economics to the masses for the same reason you show above. If the left are saying “we will give poor people free housing and education” and the right are saying “we won’t, for complex economic reasons, but please believe that you’re better off without them,” then perhaps the right need a simplistic narrative to convince poorer people that they are working in their interests? Hence the right push nationalism or religiosity. If they can’t convince poorer people that not giving them free money, education and health is actually good for them, then perhaps they can discredit the left by identifying them as unsavoury, godless, cowardly, and disloyal? Bypassing the target question of complex economics and going for the easy heuristic question about things people feel strongly about: religion, social norms, personal morality, etc.
Jan 11 2012 at 10:58am
You should check out the work of Gerd Gigerenzer on decision-making heuristics.
[here’s a free sample: http://citrixweb.mpib-berlin.mpg.de/montez/upload/PaperLibrary/GGBrighton_HomoHeuristicus-1.pdf]
He emphasizes a feature that can get lost in Kahneman’s account: Heuristics can give even better results (not merely good enough results) than attempts to directly answer the so-called “target question”, which sometimes aren’t even practically computable at all. E.g. an outfielder trying to calculate the landing point of a fly ball vs. using a simple “gaze heuristic”; same with dogs’ strategy to catch frisbees. Gigerenzer’s work has been put to use in medical decision-making with apparently meaningful results.
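A rough simulation of the gaze heuristic, with all numbers invented for the sketch: the fielder never computes the landing point from the ball’s trajectory; they simply keep adjusting their position so the ball stays at a roughly constant gaze angle, and that rule alone steers them to where it lands.

```python
import math

g, vx, vy = 9.8, 10.0, 20.0      # gravity and the ball's launch velocity (made up)
alpha = math.radians(45)         # gaze angle the fielder tries to maintain
dt, t = 0.01, 0.0

x_ball = y_ball = 0.0
fielder = 35.0                   # fielder's starting position (made up)

while True:
    t += dt
    x_ball = vx * t
    y_ball = vy * t - 0.5 * g * t * t
    if y_ball <= 0:              # ball has landed
        break
    # Keeping the gaze angle constant means standing at
    # x_ball + y_ball / tan(alpha); move a fraction of the way there each step.
    target = x_ball + y_ball / math.tan(alpha)
    fielder += 0.2 * (target - fielder)

print(abs(fielder - x_ball))     # small gap: the fielder ends up near the ball
```

As the ball’s height goes to zero, the constant-angle position converges on the landing point, which is why the rule works without any physics being solved.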
[Comment was posted earlier today and was accidentally deleted. It was copied in full and is re-posted here. Sorry for the inconvenience.–Econlib Ed.]
Jan 11 2012 at 1:07pm
That’s fascinating, and makes me wonder whether the old intuition that geeks are bad at sports, and jocks need tutors has something to it after all.
Jan 11 2012 at 1:09pm
Ok, I have now added Thinking, Fast and Slow to my list of books to read. That’s pretty good.
On a related note, I have been trying to think through an idea on moralization. I think that in the process of trying to answer the big questions in life, people turn potential answers into moralizations. Take health care, for example. People take a strong moral standpoint on either side of the issue, but in reality both sides are attempting to solve a logistical problem. Further, I think most people get stuck arguing at the moralizing level and never get down to the logistical level.
Jan 11 2012 at 2:02pm
Can government action correct for economic actors’ systematic irrationality? | How much do I dislike the word “government”?
Jan 12 2012 at 12:47am
+2 on this. What a great proposition!
Jan 12 2012 at 3:27am
This is the essence of the global warming debate; there are three questions, 1) is the world getting warmer? 2) is it anything to do with humans? and 3) what should we do about it? They have been replaced with a catch-all heuristic of “shouldn’t we all use expensive alternatives to fossil fuels to save the planet?” Any attempt at discussion is then channelled into a discussion of question 1 and a debate on data with questions 2 and 3 already “decided”.
Jan 12 2012 at 2:58pm
I bought Gigerenzer’s book, “Gut Feelings”, and cannot say I recommend it. His “proofs” that intuition works better than rational thinking are not very convincing. First, no scientist that I have heard of (and most certainly not Kahneman) denies that there are some instances when rationalizing (what Kahneman would call ‘using system 2’) is worse than acting without thinking (playing baseball is an example).
But he tries to prove that using intuition is better than rationality in things that it clearly isn’t. He tries to “prove” that intuition beats rationality in picking stocks, because people choosing stocks randomly may outperform professional traders. That is, however, exactly what the literature on cognitive bias predicts. Rationality says that stock value fluctuates randomly, so there is no true “expert” on the subject. If, however, you do a lot of trading (which is what professional traders do), you’ll lose money.
He talks about deliberate strategies (even if simple) as if they were derived from intuitions. Kahneman’s book has a good take on the subject of intuition, I really recommend it.
Jan 13 2012 at 12:55pm
I actually went out and paid cash for a copy (hardcover) of TF&S
Kahneman is clearly a Nobel level guy, but the book has a lot of flaws; I would say that as a popular writer, he is not that good.
One problem with TF&S is that terms are not clearly defined and indexed; a more serious one is that if you think carefully about the experiments he describes, the evidence he provides is often not really conclusive [e.g., for many experiments, he doesn’t show that people actually understand the math involved]; there is a whole cottage industry of people picking away at Kahneman and Tversky, of whom Gigerenzer is one.
I read one of the pdfs on Gigerenzer’s website, and it made at least as much sense as Kahneman.
The one thing I did get from TF&S is how bad economic models were before behavioural economics; it is really unbelievable.
Jan 13 2012 at 5:55pm
This observation is very closely paralleled in the book Deep Survival by Laurence Gonzales (a book I have no interest in promoting other than that I love it). Gonzales posits that we have two parallel processing methods when confronted with information relevant to our immediate survival: intellectual reasoning and emotional impulses (roughly the equivalent of System 2 and System 1 above). The book well illustrates that there is a compelling adaptive advantage to emotional reasoning in the face of danger, but that emotional (heuristic) processing can also kill you. It is very good at showing how even very knowledgeable and highly skilled people can fall prey to fatal heuristic thinking when they believe themselves to be thinking clearly.
Then of course there’s HAL . . .
Comments are closed.