“To believe or not to believe? Economics provides a simple, almost trivial-sounding, answer: believe something when the benefits of believing outweigh the costs; otherwise don’t.”

Have you heard this one?

A senior at Harvard had just written perfect answers to the first two questions of a three-essay history exam but found himself completely stumped by the third. So he took an exam book and drafted a letter (“Dear Mom: I have just finished my last final, and will soon be home for the holidays…”) and handed it in along with the first two essays. He then rushed back to his room, spare exam book in hand, and hurriedly researched the last question. He scurried to the prof’s office and, feigning panic, told him that he had finished the exam early, begun a letter to his mother, and mistakenly handed in the letter instead of the last essay. (“I’m so sorry…I almost mailed my third essay to my Mom!”) The prof fished through the exams and, sure enough, there was the letter. He exchanged it for the essay, and the student got an A.

Hmmm…

  • Why do these stories always feature a student from Harvard, and never Brandeis, Northeastern, or the University of Illinois?
  • He knew the first two questions perfectly, but had absolutely no idea what to write for the third?
  • What kind of prof is so stupid as to believe that students write letters to their mothers informing them of their plans to come home for the holidays? Wouldn’t the student arrive home before his letter did?
  • Isn’t this all just a little too good to be true?

Stories like this are fodder for folklorist Jan Harold Brunvand, who makes a living debunking these “urban legends.”

Whether we believe that particular story is probably no big deal. But fictions masquerading as facts take forms more serious than folktales: credulous patients risk their health with quack medicine; gullible investors risk their life savings in crazy investment schemes; unsuspecting teachers risk their students’ literacy with fad reading methods.

To believe or not to believe? Economics provides a simple, almost trivial-sounding, answer: believe something when the benefits of believing outweigh the costs; otherwise don’t. Notice how a high-minded verb—“to believe”—starts to wither under the glare of economic analysis; the mundane science of choice seems to reduce everything to a widget-like state.

Good. Let’s run with that thought a minute. I predict that

  • right now, you and I each have something on our “top-ten” list of beliefs that’s not true;
  • it’s probably a comparatively “low-stakes” belief—that is, the price we pay for believing it is relatively low;
  • we’d change our minds as soon as it became worth it.

Believing a falsehood is not necessarily a dumb or crazy thing to do. It may well be the smart choice. After all, the truth is costly to unearth, so having more of it means having less of everything else, widgets included; if we spent all our time checking facts there’d be no time left to earn a living, go to the beach, or sleep. People don’t live by truth alone; they need money and fun too, and sometimes they can’t have more of everything.
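To put some toy numbers on that tradeoff, here’s a back-of-the-envelope sketch in Python (the function name and every figure here are my own invented illustrations, not anything from the argument above): check a claim only when the expected loss from wrongly believing it outweighs the cost of checking.

    # A back-of-the-envelope version of the believe-or-check tradeoff.
    # All names and numbers are invented for illustration.

    def worth_checking(p_false, loss_if_wrong, cost_of_checking):
        """Fact-check a claim only when the expected loss from
        believing a falsehood exceeds the cost of verifying it."""
        return p_false * loss_if_wrong > cost_of_checking

    # A hot stock tip: even odds it's bogus, and acting on it risks $5,000.
    print(worth_checking(p_false=0.5, loss_if_wrong=5_000, cost_of_checking=20))  # True
    # The Harvard exam story: almost surely embellished, but believing it is free.
    print(worth_checking(p_false=0.9, loss_if_wrong=0, cost_of_checking=20))      # False

By this arithmetic the exam story never earns a trip to the library, while the stock tip does—which is exactly why harmless legends survive while costly ones tend to get checked.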

Here’s an example of a recently believed untruth: millions of people—ordinary folks and experts alike—used to think a primary cause of ulcers was stress, but thanks to the pioneering work of Drs. J. Robin Warren and Barry J. Marshall we now know that the main culprit is not stress but bacteria. These guys didn’t have an easy time convincing people—driven to frustration by recalcitrant, unbelieving colleagues, one of them (Marshall) guzzled a nasty bacterial brew (Helicobacter pylori) and summarily entered his own sore stomach into evidence.

That was back in 1984. Nowadays the bacterial theory of ulcers is close to medical orthodoxy, but its path to acceptance was rocky and slow. Even as recently as 1996, a significant minority of primary care physicians (18 percent) were prescribing ineffective or suboptimal treatments for patients suffering from H. pylori’s effects (e.g., giving them antacids but not antibiotics).[1] What’s more, this was going on even while ordinary magazine readers like myself had learned the true story, as reported, for example, by Terence Monmaney in the September 20, 1993 issue of The New Yorker.

The idea that wrong-headed beliefs can persist has not escaped the attention of economists. The year before Monmaney’s New Yorker article appeared, economists Sushil Bikhchandani, David Hirshleifer, and Ivo Welch (BHW for short) published a paper in the Journal of Political Economy called “A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades.” BHW’s idea is simple: to make decisions in an uncertain world, we typically rely on two kinds of information—tidbits we dig up on our own and clues we pick up from watching others. A doctor, for example, could read Marshall and Warren’s 1984 Lancet article[2] or imitate her colleagues. But—and here’s the rub—this practical-sounding strategy sometimes backfires by generating a misdirected “informational cascade.” One person, then another, picks up a bad idea, others copy, and pretty soon the whole herd gallops off in the wrong direction. Not that doctors are dumb or lazy. Busy primary care physicians have to treat more than just gastrointestinal disorders, and who has time to read the journals cover to cover anyway? (Plus, sticking to standard operating procedure might be the best way to avert possible lawsuits.)
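To see the mechanics, here’s a toy simulation in the spirit of BHW’s sequential-choice story—a minimal sketch under my own simplifying assumptions (binary signals that are individually right with probability p, and agents who break ties by following their own signal), not BHW’s exact model or notation:

    import random

    def simulate_bhw(p=0.7, n_agents=40, world_is_good=True, seed=0):
        # Each agent privately sees a signal that is right with probability p,
        # observes every earlier agent's adopt/reject choice, and acts on the
        # combined evidence.  Once the inferred evidence outweighs any single
        # signal, a cascade starts and later choices reveal nothing new.
        rng = random.Random(seed)
        evidence = 0                 # +1 per inferred "good" signal, -1 per "bad"
        actions = []
        for _ in range(n_agents):
            correct = rng.random() < p
            signal = 1 if correct == world_is_good else -1
            if abs(evidence) >= 2:   # cascade: one signal can't tip the scale
                action = 1 if evidence > 0 else -1
            else:                    # weigh predecessors' signals plus one's own
                total = evidence + signal
                action = signal if total == 0 else (1 if total > 0 else -1)
                evidence += action   # pre-cascade actions reveal private signals
            actions.append(action)
        return actions

    # How often does the herd end up rejecting a good thing, even though
    # every individual signal is right 70 percent of the time?
    runs = 10_000
    wrong = sum(simulate_bhw(seed=s)[-1] == -1 for s in range(runs))
    print(f"incorrect cascades: {100 * wrong / runs:.1f}% of runs")

Under these assumptions the herd locks onto the wrong answer in roughly 15 percent of runs—each agent behaving sensibly, the group galloping off a cliff. That’s the flavor of the “simple math” result mentioned below.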

More generally, who has time to independently check every single fact? BHW argue, for example, that it would be absurd for everyone to verify personally that the world is round. Somebody’s already done that and the news is out; why not take their word for it and move on? In fact, we can thank these “good informational cascades” for most of what we know, since we’re more likely to learn things from reliable hearsay than to discover them all by ourselves.

BHW’s point, though, is that things can go wrong. By their logic, the specter of wrong-way stampedes lurks whenever people follow the crowd in search of what’s good: movies, restaurants, stocks, parenting techniques, you name it. Who hasn’t been burned by a rotten summer blockbuster because they followed the Cineplex crowds? In BHW’s terminology, they’ve been caught in an “incorrect Up cascade,” in which naïve throngs vote with their feet, giving an undeserved “Thumbs Up” (Big Toes Up?) to a clunker. Less apparent are “incorrect Down cascades”—good movies overlooked by the crowds and left to languish in obscurity. Almost by definition, they’re a lot harder to identify. (Here’s my candidate for the best scary movie you’ve never heard of: The Wicker Man (1973). If you like scary movies—and if you can find it, because it’s buried deep down the Down cascade—rent it and tell me what you think.)

BHW use simple math to show that incorrect cascades are more common than we might think. They also supply real-world examples of cascade-fueled, silly fads. (One—tonsillectomies—is pure baby-boomer nostalgia. Who among us doesn’t know someone who had their tonsils out as a kid? Guess what? That procedure was mostly useless, sometimes harmful, and occasionally lethal.)

Even worse: people are so swayed by groupthink that they sometimes let a crowd’s wrong perception trump their own correct view, as social psychologist Solomon Asch famously demonstrated. Subjects surrounded by wrong-answering confederates abandoned their own correct assessments of the length of a line in favor of the crowd’s judgment.[3]

And maybe even worse than that, it seems we all have an appetite for good yarns like the story of the Harvard student. As a colleague of mine once put it: “If it’s not true, it should be!”

What’s a rational person to do? One view is that cascades are part and parcel of our imperfect world; to escape a life of full-time fact-checking we have to accept a wrong turn now and then. This dilemma, after all, is just one of hundreds of economic tradeoffs that we routinely handle by balancing costs and benefits. Still, applying ordinary cost-benefit analysis to beliefs can be illuminating.

For example: my childhood triumvirate of make-believe benefactors—Santa Claus, the Easter Bunny, and the Tooth Fairy—didn’t topple all at once but piecemeal, in reverse order of their largesse. The Tooth Fairy, good for just a lousy dime, was the first to go, but Santa Claus (all those toys!) held on for longer than I dare admit. (OK, I gave him up at age 10. Go ahead and laugh, but I was just following Pascal’s gamble: believe in something if the expected benefits outweigh the costs.)
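In the same sketchy spirit, here’s the childhood Pascal arithmetic (every number invented for illustration except the Tooth Fairy’s dime):

    # Pascal's-gamble arithmetic for the childhood pantheon.  The
    # probabilities and costs are invented; only the dime is from the text.

    def keep_believing(prob_real, payoff_if_real, cost_of_belief):
        # Keep believing while the expected benefit beats the cost.
        return prob_real * payoff_if_real > cost_of_belief

    print(keep_believing(prob_real=0.1, payoff_if_real=200.00, cost_of_belief=5))  # Santa: True
    print(keep_believing(prob_real=0.1, payoff_if_real=0.10, cost_of_belief=5))    # Tooth Fairy: False

Largest largesse, last to go.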

An optimistic corollary to the cost-benefit idea is that the wrong beliefs we do harbor are likely to be the ones that harm us least. Thinking that “The League of Extraordinary Gentlemen” is a good movie does less damage than thinking the Yugo is a good car. Ulcer specialists—gastroenterologists—were far less likely to choose the wrong treatment than their generalist, primary-care counterparts. So urban legends are less likely to prevail in high-stakes situations. (But not always: the 60 Minutes “exposé” on the supposed “sudden acceleration” of the Audi 5000 in the 1980s—a whopper of an urban legend if there ever was one—nearly put the company out of business.)

Let’s push the cost-benefit idea further to get a prognosis on beliefs. Consider the Internet: though it is a fertile breeding ground for hoaxes and rumors, it has nonetheless reduced information costs enormously in the dozen short years since BHW wrote their article. Diligent fact-checkers have resources like ScamBusters.org and dozens of related sites. And it’s easy, for example, for patients to use the Web to learn about new medications before even their doctors find out about them.

Technology is also changing the way science itself is conducted. Increasingly, the gold standard of randomized trials is displacing other ways of gathering evidence, and helping truth supplant fiction. Most recently, the decades-long belief in the beneficial effects of hormone replacement therapy for post-menopausal women has crumbled under the weight of evidence from randomized trials.

Technological change can alter the costs and benefits of adhering to certain beliefs. Take the controversy over embryonic stem cell research, for example. The U.S. Roman Catholic Bishops oppose it, saying that life should be regarded as sacred from the moment of conception.[4] But scientists, patient advocates (like Parkinson’s sufferer Michael J. Fox), drug companies, clinics, some politicians, and some government health care officials are banking on the research program’s promise of treatments, and even cures, for diseases like Alzheimer’s and type 1 diabetes.

Let me offer the following heavily caveated and qualified prediction about the religious, philosophical, and ethical beliefs surrounding stem cell research: I’ll bet that, over time, if embryonic stem cell research begins to deliver on its many promises, more and more people will lean toward Michael J. Fox’s perspective rather than the Catholic bishops’. Not that I have any special knowledge regarding the truth of any of the arguments: I’m not a theologian or a doctor, and I have no authority or expertise whatsoever concerning the ethics, the philosophy, the theology, or the science of stem cells. That’s caveat #1. Caveat #2: I’m not saying that people should switch over to Fox’s perspective; I’m just saying that that’s probably what they will do. (There’s a big difference between using economics to predict what will or might happen—positive economics—and what should happen—normative economics. This discussion is squarely in the realm of the positive.)

To repeat: over time—and if further discoveries reveal increased benefits of embryonic stem cell research—more and more people will hold religious and spiritual beliefs consistent with a sanguine view of such research. Why?

Because if embryonic stem cell research produces additional health benefits, the opportunity costs of believing that a blastocyst is sacred will rise. Not everyone will switch, of course; maybe just a few. Some, perhaps most, will fervently adhere to the Catholic bishops’ position. Some will have no stake and will be out of the loop. But someone previously on the fence who learns she might benefit from such research has an incentive to lean toward favoring it.

This is the way economic analysis predicts the wind will blow. (Which, to repeat, is not the same as saying it’s the way the wind should blow.)

While technological advance can cause beliefs to change, there is much that it cannot do. It can’t settle matters of religious faith. That’s why the views of the most fervent believers won’t change; for them, there’s no tradeoff. And scientific progress is a force that’s apt to create, rather than solve, thorny ethical issues.

Neither, I think, will technology ever be able to do much to alter our basic appetites or foibles. So even matters that could be settled by scientific evidence probably won’t be—don’t expect all urban legends to disappear anytime soon. In fact, it’s a safer bet that old folk tales will just wind up getting supplanted by newer, more cost-effective ones.

For instance, let’s go back to Barry Marshall and Helicobacter pylori. Just what did happen after he gave himself a stomachache with that nasty beakerful of bacteria? Did his self-experiment help prove that ulcers could be treated with antibiotics? Here’s one account:

After ten days gastroscopy and microscopy found that spiral bacteria had established themselves in Marshall’s stomach. On the fourteenth day, Marshall began taking the antibiotic tinidazole and his symptoms resolved within twenty-four hours.[5]

But wait, here’s a completely different one:

Marshall hoped to demonstrate that the bacteria could cause peptic ulcer disease. Marshall did in fact develop a severe case of gastritis, but the painful inflammation vanished without treatment.[6]

The first account is more interesting and triumphal (Plucky Doc Cures Self-Inflicted Ulcer!), so my cynical bet is that the humdrum second account is the true one.

So, where are those top-ten beliefs of yours? Like I said, I guarantee at least one is wrong. Here’s hoping it doesn’t matter too much.


Footnotes

1. Breuer T, Goodman KJ, Malaty HM, Sudhop T, Graham DY. “How Do Clinicians Practicing in the U.S. Manage Helicobacter pylori-Related Gastrointestinal Diseases? A Comparison of Primary Care and Specialist Physicians.” American Journal of Gastroenterology 1998;93:553-561.

2. Marshall BJ, Warren JR. “Unidentified Curved Bacilli in the Stomach of Patients with Gastritis and Peptic Ulceration.” Lancet 1984;1(8390):1311-1315.

3. Asch SE. “Effects of Group Pressure upon the Modification and Distortion of Judgments.” In H. Guetzkow (Ed.), Groups, Leadership and Men. Pittsburgh, PA: Carnegie Press, 1951.

4. From the Congregation for the Doctrine of the Faith, Pastoral Instruction Donum Vitae (1987, I, 1): “…[the] human being is to be respected and treated as a person from the moment of conception and therefore from that same moment his rights as a person must be recognized….” The Catholic Church’s position does not rest on exactly when an individual human life begins; out of prudence it holds that, for policy purposes, the clock should be assumed to start at fertilization, so as to maximize the benefit of the doubt accorded to the embryo.

5. Thagard P. “Ulcers and Bacteria II: Instruments, Experiments, and Social Interactions.” Studies in History and Philosophy of Science (forthcoming).

6. Blaser MJ. “The Bacteria Behind Ulcers.” Scientific American, February 1996, Vol. 274, Issue 2.



*Donald Cox is Professor of Economics at Boston College. His email address is donald.cox@bc.edu.
