I recommend the “triple issue” of Critical Review, entitled “Is Democratic Competence Possible?” Its starting point is a 40-year-old essay by public-opinion researcher Philip Converse called “The Nature of Belief Systems in Mass Publics.” I am using this post to write up a few notes on the issue.

Converse found that most people, perhaps 90 percent, do not form their political beliefs on the basis of knowledge of current events and an understanding of the logical interconnections of different viewpoints. Only the elites, whom he calls “ideologues,” understand conservative and liberal ideology and have basic knowledge of, say, the identity of the Secretary of State.

Gregory J. Wawro gives a good summary of the state of the debate.

Members of the American electorate have been found to know very little factual information when it comes to government and politics. For that reason (according to Converse), many of them essentially have political non-attitudes: policy opinions that are random over time or internally inconsistent and hence not meaningful…voters may at best make poor choices, and at worst may be manipulated into supporting candidates or policies that will not serve their interests.

But all is not lost, according to many public-opinion researchers. They claim that neither factual knowledge of American government and politics, nor well-formed political opinions, are essential to making “good” political choices. The typical citizen can get by pretty well by using various cues that are easily and cheaply accessible. Furthermore, although things may look problematic at the individual level, the aggregation of choices and opinions eliminates the pernicious effects of ignorance and apathy, making the public as a whole appear “rational” or competent.
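The aggregation claim is easy to see with a toy simulation (my own illustration, not from the issue, with made-up numbers): if even a small informed minority votes for the better of two options while everyone else effectively flips coins, the coin flips cancel out and the majority lands on the better option almost every time.

```python
import random

def election(n_voters=100_001, informed_share=0.05, seed=0):
    """One simulated two-option election: a small informed bloc votes for
    the better option; everyone else votes at random. Returns True if the
    better option wins a majority."""
    rng = random.Random(seed)
    n_informed = int(n_voters * informed_share)
    votes_for_better = n_informed  # the informed bloc votes as one
    for _ in range(n_voters - n_informed):
        if rng.random() < 0.5:     # uninformed voter: a coin flip
            votes_for_better += 1
    return votes_for_better > n_voters // 2

wins = sum(election(seed=s) for s in range(50))
print(f"better option wins {wins}/50 simulated elections")
```

With 5 percent informed voters the random votes split almost exactly in half, so the informed margin decides essentially every election. This is the statistical intuition behind the public appearing “rational” in the aggregate; the critics in the issue dispute whether real errors cancel so neatly.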

Editor Jeffrey Friedman’s introduction stresses what he calls “neglected implications” of Converse’s work. In particular, Friedman points out that the cognitive strategy of the elite, which consists of fitting political issues into a structured view of the world, is by no means perfect.

Converse’s political elites are particularly well informed about what it means to be a conservative or a liberal, and their reasoning about politics is structured by this knowledge. But Converse’s findings suggest, I think, that their relatively high levels of ideological knowledge are due to their being conservative or liberal ideologues: closed-minded partisans of one point of view.

…For Converse ([1964] 2006, 7, emph. original), “what is important is that the elites familiar with the total shapes of these belief systems have experienced them as logically constrained clusters of ideas.” But this experience does not stem from the ideologue’s astute reasoning or her keen investigation of reality. Her views are, instead, determined by the political belief system she has been taught. This worldview, in turn, has been concocted by a “creative synthesizer” of that belief system. Only a “minuscule proportion of any population” is capable of such creative syntheses (Converse [1964] 2006, 8).

…The adherents of belief systems, while a small fraction (e.g., 2.5 percent) of the mass public, nevertheless number in the millions, dwarfing the group of creative ideological synthesizers who generate the ideas merely repeated by their “sophisticated” followers.
Perhaps we should call the creative synthesizers “ideologists,” to avoid conflating them with the legions of “ideologues” who are their pupils…Ideologists lead. Ideologues follow. And the mass public, uninstructed in ideology, wanders.

…Yes, the ideologue may have predictable political attitudes, but should that be considered good?

…The more deeply rooted one’s causal theories (deeply rooted in one’s perceptions, not in the realities one is trying to perceive), the easier it will be to accumulate political information that fits those theories.
By the same token, one’s causal theories will tend to validate themselves. The aspects of the world that fit an ideology are the facts that its implicit causal theories make easy to spot and causally intercorrelate. The glaringly “obvious” profusion of this confirmatory evidence testifies, in the mind of the ideologue, not to her selective perception and retention of information, but to the accuracy of the theory (however inarticulate) that makes the evidence for it so “visible” to begin with. The ideologue’s growing stockpile of information thus functions as ammunition with which to repel challenges to the causal theories that have allowed her to accumulate the information in the first place. For this reason, being more informed about politics than most people are can actually mean being worse informed—if one’s causal theory is incorrect.

…in science, the proclivity to see in “the facts” only confirmation of one’s theories is overcome, to some degree, by the trial and error of controlled experiments that can falsify incorrect theories. The spiral of conviction regarding the truth of a given scientific theory eventually peters out, even if this requires the passing of the old “cohort” of scientific ideologues. Usually, no such corrective is available in politics.
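Friedman’s “spiral of conviction” can be made concrete with a toy model (my own sketch, not his): an observer watches flips of a fair coin but holds the wrong theory that heads come up 70 percent of the time. If she registers every flip, the evidence pushes her toward the truth; if she tends to overlook disconfirming flips, the same fair coin makes her ever more confident in the wrong theory.

```python
import math
import random

def confidence_in_wrong_theory(n_obs=2000, discount=0.3, seed=1):
    """Log-odds for a wrong theory ('heads come up 70% of the time')
    against the truth ('the coin is fair'), after n_obs fair flips.
    Confirming flips are always registered; disconfirming flips are
    registered only with probability `discount` (selective perception)."""
    rng = random.Random(seed)
    log_odds = 0.0  # start with even odds between theory and truth
    for _ in range(n_obs):
        if rng.random() < 0.5:            # heads: fits the theory
            log_odds += math.log(0.7 / 0.5)
        elif rng.random() < discount:     # tails: often goes unnoticed
            log_odds += math.log(0.3 / 0.5)
    return log_odds

print(confidence_in_wrong_theory(discount=1.0))  # honest updater: negative, truth wins
print(confidence_in_wrong_theory(discount=0.3))  # motivated skeptic: positive, wrong theory "confirmed"
```

The only difference between the two runs is whether contrary evidence gets counted, which is exactly Friedman’s point: the ideologue’s growing stockpile of “confirmations” measures her filter, not the world.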

Friedman goes on to say that Converse’s findings argue against ideological interpretations of election results.

The conventional political wisdom holds, 26 years after the fact, that we are in the (weakening) grip of a “conservative revolution” that was inaugurated by Ronald Reagan’s “landslide” 1980 election. Leaving aside the fact that 51 percent of the popular vote is no landslide, a reader of Converse will be suspicious of claims about a “tidal wave” of right-wing (or any other-wing) public sentiment bringing about an “era” of some consistent ideological stripe.

Wherever there are ideological “attitudes,” we would expect them to be relatively stable, because of the constraining effect of the ideologies. At the elite level, then, it would be astonishing to find closed-minded ideologues converting to the other side overnight…

Not surprisingly, then, the survey data betray little hint of the vaunted conservative revolution (see Page and Shapiro 1992; Schwab 1991, ch. 2). On the basis of these data, it is safe to say that most voters had no idea what specific policies Reagan advocated, and would have disapproved of them if they had.

Critical Review reprints Converse’s original essay. I found particularly interesting one of his smaller points, about how the Nazi Party probably was not understood ideologically by most of its supporters.

it seems safe to conclude that the mass base of the Nazi movement represented one of the more unrelievedly ill-informed clienteles that a major party has assembled in a modern state.

Scott Althaus writes,

For most of recorded history the political ignorance of ordinary people was therefore more a “given” than a “crisis.” It was a natural condition rather than an immediate threat to the viability of political systems, and partly it was seen in this way because the health of political systems often was held to reside more in the hands of elites than the masses. Political ignorance was seen as a liability, to be sure, but it was a problem whose solution lay in proper institutional design, rather than something that invalidated the function or legitimacy of those institutions. The masses served mainly as a check to ensure that the elites didn’t fall too far out of line with community standards.

Samuel DeCanio writes,

once we recognize the extent of mass ignorance of democratic politics, the possibility of rule by autonomous state elites comes to seem not only more likely, but more extreme than extant theoretical frameworks suggest.

Ilya Somin argues that it may be rational to vote but irrational to gather information to vote intelligently. (Somin posts on the Volokh Conspiracy blog, for example here.) In his article, he writes,

Rationally ignorant voters are unable to keep track of more than a tiny fraction of all this government activity. Indeed, they probably would be unable to do so even with considerably greater knowledge than most of them currently possess. Other things equal, the greater the size and complexity of government, the greater the likelihood that many of its activities will escape meaningful democratic control.
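Somin’s distinction rests on simple expected-value arithmetic, which a back-of-the-envelope sketch makes plain (the numbers below are my own hypothetical figures, not Somin’s): the expressive or civic payoff of voting can exceed the small cost of a trip to the polls, but the expected policy payoff of one vote is far too small to justify hours of research.

```python
# Hypothetical back-of-the-envelope numbers (illustrative, not Somin's):
electorate    = 100_000_000     # voters in a national election
p_decisive    = 1 / electorate  # rough chance one vote tips the outcome
stake         = 10_000          # dollar value to the voter of the better policy winning
research_cost = 500             # value of the hours needed to vote knowledgeably
vote_cost     = 2               # the trip to the polls
civic_benefit = 10              # expressive/duty satisfaction from voting itself

expected_policy_gain = p_decisive * stake  # $0.0001

rational_to_vote  = civic_benefit > vote_cost            # voting pays for its own sake
rational_to_study = expected_policy_gain > research_cost # research cannot pay
print(rational_to_vote, rational_to_study)
```

However one tweaks these numbers, the expected policy gain stays orders of magnitude below any plausible research cost, which is why ignorance is “rational” even for a conscientious voter.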

Wawro (quoted earlier) argues that people adjust the facts to fit their political beliefs. His article is entitled, “The Rationalizing Public?”

One of my favorite examples…In February 2003, on the eve of the invasion of Iraq, an ABC News/Washington Post poll found that 72 percent of those surveyed believed that it was “likely” that “Saddam Hussein was personally involved in the September 11 terrorist attacks.”

…Self-identified Republicans were (and still are) far more likely to think there was a connection than were Democrats.

…[rationalization] occurs both among respondents who are politically active and those who have had little exposure to the media.

…Even in the small fraction of the public that has ideologically coherent attitudes, it is not necessarily the case that attitudes proceed in a straightforward manner from information that is then sorted into a coherent, usable form by belief systems. The attitudes, hence the ideologies, may be as much a cause as a consequence of the beliefs about factual “information” that are logically supposed to help determine one’s political choices.

Out of roughly 500 references in the volume, two intrigued me. One was John R. Zaller’s book, The Nature and Origins of Mass Opinion. The other was a recent paper by Charles S. Taber and Milton Lodge on motivated skepticism.

Physicists do it (Glanz, 2000). Psychologists do it (Kruglanski & Webster, 1996). Even political scientists do it (cites withheld to protect the guilty among us). Research findings confirming a hypothesis are accepted more or less at face value, but when confronted with contrary evidence, we become “motivated skeptics” (Kunda, 1990), mulling over possible reasons for the “failure”, picking apart possible flaws in the study, recoding variables, and only when all the counterarguing fails do we rethink our beliefs. Whether this systematic bias in how scientists deal with evidence is rational or not is debatable, though one negative consequence is that bad theories and weak hypotheses, like prejudices, persist longer then [sic] they should.

But what about ordinary citizens?…we report the results of two experiments showing that citizens are prone to overly accommodate supportive evidence while dismissing out of hand evidence that challenges their prior attitudes. On reading a balanced set of pro and con arguments about affirmative action or gun control, we find that rather than moderating or simply maintaining their original attitudes, citizens – especially those who feel the strongest about the issue and are the most sophisticated – strengthen their attitudes in ways not warranted by the evidence.