In the Preface to his new Knowledge, Reality, and Value, Mike Huemer engages in some humorous megalomania.  In response to the question, “Why read this book?,” Huemer states:

The author. I’m smart, I know a lot, and I’m not confused – which means you can probably learn a lot from this book. You probably won’t learn too many falsehoods, and you probably won’t run into too many passages that don’t make sense.

All accurate.  Huemer is very smart, does indeed know a lot, and is not confused.  And his book does indeed contain few falsehoods.  To keep this Book Club interesting, however, I’m going to focus on what I see as the major errors.  And in any case, if I focused on where he’s right, the Club would take months.

This week covers “Part 1: Preliminaries.”  Let’s dive right in.

1. Philosophical progress.

Myth #2: Philosophy never makes progress. Philosophers are still debating the same things they were debating 2000 years ago.

Comment: No, that’s completely false.

Huemer then presents some examples of (a) relatively new philosophical issues (like modal realism), and (b) debates that have largely been resolved in the eyes of professional philosophers (like the morality of homosexuality).

Strictly speaking, he’s obviously right, but I still say this “Myth” is insightful (and painful).  Compared to other academic disciplines, philosophers really do spend a lot of time rehashing 2000-year-old debates.  It is hard to imagine, for example, that consequentialism will ever conclusively triumph over deontology.

And if it does, it will probably be based primarily on conformity, not arguments.  Consider: Despite the current philosophical consensus, any halfway decent philosopher could easily construct half a dozen arguments for the immorality of homosexuality.  A utilitarian, for example, might oppose it for lowering birthrates, or for creating a large social conflict for the benefit of a small minority of people.  And given the utilitarian framework, it’s not clear these arguments are wrong.

2. Validity versus soundness.

Huemer confesses, “I’m about to tell you why I hate the way my fellow philosophers (and I!) use the words ‘valid’ and ‘sound’.”  Namely:

I hate the philosophical usage of “valid” and “sound” because in normal English “valid” and “sound” both sound like they mean “perfectly okay”, or something like that.

While Huemer definitely offers some puzzling examples of valid arguments that fit the textbook definitions, I still say that it’s useful to distinguish between (a) arguments that fail because they contain false premises, and (b) arguments that fail because they don’t make sense on their own terms.  “Valid” and “sound” fit the bill, and I know of no substitute on the market.  And any substitute would probably be equally confusing, because English lacks everyday terms for this distinction.
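For readers new to the jargon, here is a minimal illustration of the distinction (my examples, not Huemer’s).  An argument is valid when the conclusion follows from the premises; it is sound when it is valid and its premises are all true:

```latex
% Valid but unsound: the form is correct, yet premise (1) is false.
%   (1) All mammals can fly.
%   (2) Whales are mammals.
%   (C) Therefore, whales can fly.
\forall x\,(M(x) \to F(x)),\quad M(w) \;\vdash\; F(w)

% Invalid: even if both premises were true, the conclusion would not follow.
%   (1) All dogs are mammals.
%   (2) Whales are mammals.
%   (C) Therefore, whales are dogs.
\forall x\,(D(x) \to M(x)),\quad M(w) \;\not\vdash\; D(w)
```

On this usage, the first argument fails only because of a false premise, while the second fails on its own terms no matter what the facts are.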

3. “Truth is good for you.”

Huemer writes:

Truth is good for you. More precisely, knowing the truth is generally good for attaining your goals. For whatever goals you have in your life, it is almost always useful to have true beliefs…

Ordinary errors cause you to make ordinary, small mistakes. E.g., being wrong about what stores sell burritos causes you to waste time and not get your burrito. Philosophical errors, on the other hand, cause you to make bigger mistakes, like wasting your life.

As the author of The Myth of the Rational Voter, I have to call this a major overstatement.  Most people persistently hold many false beliefs, largely because most beliefs are barely related to any practical goals.  Furthermore, some important truths, including philosophical truths, are unpopular, which points to a major way that knowing the truth hinders the common goal of being well-liked by other people.  To spell things out: holding unpopular truths often leads to voicing unpopular truths, which often makes people dislike you.

On balance, I suspect that having a stern truth-seeking mentality is pragmatically useful compared to being a typical conformist, but the evidence is fairly weak.  (I do, however, agree with Huemer that we have a prima facie moral duty to seek the truth even when the consequences are bad.)

4. How to be objective.  Huemer advises:

For example, when responding to opposing views, you should respond to the most plausible opposing views and address the strongest arguments for those views – that is, the views and arguments that have the greatest chance of being correct while being importantly different from your own view. When you explain what your “opponents” think, try to state their views in the way that they themselves would state them.

What should you do, though, if almost all of your opponents believe in weak arguments for their own view?  If you put the strongest arguments in their mouths, you fail to “state their views in the way that they themselves would state them.”  I call this the Straw Man Straw Man, and it comes up often in political discussion.  My response is to start by criticizing popular arguments, then criticize the “steelmanned” position to cover my bases.

5. Frequent fallacies.  Huemer writes:

I’m not sure I’ve ever seen someone affirm the consequent or deny the antecedent. To the extent that the list identifies genuine errors, most of them are pretty dumb, so you probably don’t need much discussion of them.

A day or so after I read this, I read an argument that affirmed the consequent.  Unfortunately, I can’t remember what it was.  But trust me, it happened!
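For reference, here are the two fallacy schemas in question, with everyday instances (my illustrations, not Huemer’s):

```latex
% Affirming the consequent (invalid): from "if p then q" and "q", infer "p".
%   If it rained last night, the streets are wet.  The streets are wet.
%   Therefore, it rained last night.  (Maybe a street cleaner came by.)
p \to q,\quad q \;\not\vdash\; p

% Denying the antecedent (invalid): from "if p then q" and "not p", infer "not q".
%   If it rained last night, the streets are wet.  It did not rain last night.
%   Therefore, the streets are not wet.
p \to q,\quad \neg p \;\not\vdash\; \neg q
```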

6. “Subjective claims” are underrated.  Huemer plausibly writes:

Roughly, a “subjective” claim is one that requires a judgment call, so it can’t just be straightforwardly and decisively established. For example, the judgment that political candidate A is “unqualified” for the office; the judgment that it’s worse to be unjustly imprisoned for 5 years than to be prevented from migrating to the country one wants to live in; the judgment that Louis CK’s jokes are “offensive”; etc…

Note: I am not saying that there is “no fact” or “no answer” as to whether these things are the case, or that they are dependent on people’s “opinions”. What I am saying is that there are not clear, established criteria for these claims, so it is difficult to verify them…

People often rely on subjective premises when arguing about controversial issues. The problem with this is that subjective claims are more open to bias than relatively objective (that’s the opposite of “subjective”) claims. So people with different philosophical (or political, or religious) views will tend to disagree a lot about subjective claims. And for that reason, they are ill suited to serve as premises in philosophical, political, or religious arguments. Advice: Try to base your arguments, as much as possible, on relatively objective claims.

Yet on reflection, it is hard to reconcile this with Huemer’s earlier advice to “Use weak, widely-shared premises” when crafting arguments.  How so?  Because many claims that “can’t just be straightforwardly and decisively established” are also widely-shared – and many claims that can be straightforwardly and decisively established remain controversial!  “Being mean to children is worse than being mean to adults” is weak, widely-shared, and hard to straightforwardly and decisively establish.  “Communist governments murdered millions of people” is strong, narrowly-shared, and easy to straightforwardly and decisively establish.  I agree that subjective claims are less likely to be “weak and widely-shared.”  But given that you’re reasoning from weak and widely-shared premises, it’s hard to see any additional reason to avoid subjective premises.

7. The popularity of absolutism.  Huemer writes:

Beginning philosophy students sometimes want to know whether there is “absolute truth” or “objective reality”. These questions are not much discussed in contemporary, academic philosophy because there is not much disagreement about them among philosophy professors.

Later:

Philosophy professors, at least those from major research universities, tend to hate truth relativism. (Sometimes, we wonder where students learned relativism and what can be done about it. It wasn’t from us! Maybe they learned it in high school?) Why should we hate relativism?

So people like Richard Rorty are a tiny minority in academic philosophy?  (More a question than a criticism).

Like Huemer, I took many undergraduate philosophy courses at UC Berkeley.  While I don’t recall anyone promoting relativism, many professors spent most of their time arguing for radical skepticism.  Logically, saying “No one knows anything” is not the same as “Truth is relative.”  But once students feel like they have no way to reach the truth, it’s hardly surprising if they switch over to ersatz agent-relative versions of “truth.”  So maybe philosophy professors bear some collective guilt for this after all?

8. Since I’m basically out of disagreements for Part 1, let me end with a particularly excellent passage which I emphatically support:

Truth relativism does not just fail to be true, and it does not just fail to aim at truth; truth relativism actively discourages the pursuit of truth. How so? The relativist essentially holds that all beliefs are equally good. But if that’s the case, then there is no point to engaging in philosophical reasoning. We might as well just believe whatever we want, since our beliefs will be just as good either way. But this undermines essentially everything that we’re trying to do. When we teach philosophy, we’re trying to teach students to think carefully, and rationally, and objectively about the big philosophical questions (which hopefully will help you think well about other stuff too). When we do research in philosophy, we try to uncover more of the truth about these questions, so that we can all better understand our place in the world. All of that is undermined if we decide that it doesn’t matter what we think since all beliefs are equally good.

To repeat, please leave your questions in the comments for both me and Huemer.  I’ll respond later this week, and he’ll reply in due time.