Knowledge, Reality, and Value Book Club Replies, Part 1
Here are my reactions to Part 1 of the book club for Mike Huemer’s new book:
Thoughts/comments directed primarily towards Bryan:
Compared to other academic disciplines, philosophers really do spend a lot of time rehashing 2000-year-old debates.
I suspect this is at least in part due to the fact that of all the disciplines in the academy, philosophy is the one with the longest history. Chemists don’t rehash 2,000-year-old debates because 2,000 years ago, chemistry wasn’t a thing. Historians, by contrast, do tend to spend more time rehashing, say, the history of the Roman Empire and other subjects that are thousands of years old, simply because history, like philosophy, stretches back that far as a discipline.
I don’t think the analogy works. Philosophers keep rehashing fundamental questions; historians, in contrast, rehash details. The consequentialism/deontology debate seems roughly comparable to “Did the Roman Empire exist?” Furthermore, by your logic, chemists are going to start rehashing fundamental questions in a few centuries. Seems highly unlikely.
I’m not sure that this is exactly a disagreement, but it seems relevant. On 6), about subjective claims: I think that we should avoid the words “objective” and “subjective” as much as possible. Why? Well, look at how they are often used in language:
–As rhetorical sticks with which to beat your interlocutor over the head. Compare “X is true!” with “X is objectively true!”. Adding “objectively” did not alter what was being claimed…
–As premature ends to what could have been a productive conversation. For example, at multiple times in my life I have found myself trying to engage someone in a philosophical conversation about ethics, free will, etc., only to run into the “X is just subjective” wall. This is especially annoying because that is assumed to be an uncontroversial fact, when it isn’t uncontroversial even among the people who study it the most (philosophers)…
It’s not that the subjective/objective distinction is entirely bogus; it’s just misused, and assumed to be more clear-cut than it really is…
All reasonable. Using objective as a synonym for “rational” or “logical” makes sense to me. So does objective as a synonym for “directly experienceable by multiple observers.” In that sense, my pain is subjective (only I can directly observe my own pain), but my left hand is objective (multiple people can directly observe it). But most facts declared to be “subjective” fit neither category.
Fashionable discourse about racism often commits the fallacy of affirming the consequent. ‘If an institution is racist, it has negative disparate impact on Blacks. Institution A has negative disparate impact on Blacks. Therefore, A is racist.’
The fallacy of affirming the consequent mistakenly interprets a sufficient cause as a necessary cause. In the example I gave above, institution A’s disparate impact on Blacks might have main causes other than racism. Affirming the consequent regrettably short-circuits empirical inquiry and careful causal analysis.
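To make the structure of the commenter’s point explicit, here is the schema in standard logical notation; the labels P and Q are just illustrative shorthand, with P = “institution A is racist” and Q = “institution A has a negative disparate impact on Blacks”:

% Requires amsmath and amssymb.
% P and Q are illustrative labels for the racism example above.
\begin{align*}
\text{Modus ponens (valid):} \quad & P \to Q,\; P \;\vdash\; Q \\
\text{Affirming the consequent (invalid):} \quad & P \to Q,\; Q \;\nvdash\; P
\end{align*}

The second pattern would only go through if racism were the sole (necessary) cause of disparate impact, which is exactly the sufficient-versus-necessary confusion described above.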
Agree. Huemer seems more wrong than I realized on this point.
I am shocked you did not discuss Huemer’s ideas on the media starting on page 75! They seem to be in stark contrast with your own, and I was expecting you to mention them for sure. You trust the media when they make direct observations, but distrust them with regard to “causation, meaning, and importance.” I agree with this position. However, Huemer seems to distrust the media even for direct observations. Why? The example he uses is a case where the media got a direct observation wrong: they used the defendant’s lawyer as a source rather than the court, which appears to be a failure of direct observation rather than of those three things…
Technically, this wasn’t a “direct observation” because the media was repeating someone else’s alleged observations without properly checking them. But this is still negligent reporting. How common does Huemer think such negligence is, and how does that compare with its actual frequency? I’ll let him answer for himself.
#4 Suppose someone holds a position: Policy X will have good effect Z and just does not mention bad effect Y or just dismisses it as negligible. A “strawman” counterargument would be that Policy X has a non-negligible bad effect Y and therefore X should not be adopted. The steelman argument would be that according to some weighing scheme that the proponent of X might accept, the sum of the good effects Z and bad effects Y is less than zero and therefore X should not be adopted. A super steelman argument would be that Policy X’ will have the good effects of X but without the bad effects Y and so X’ should be adopted instead of X. I think one should always use the steelman or super steelman argument and eschew the strawman argument entirely.
I don’t see how the final sentence follows. Suppose 99% of proponents of X hold the most naive view. Should we ignore them despite their prevalence?
On balance, I suspect that having a stern truth-seeking mentality is pragmatically useful compared to being a typical conformist, but the evidence is fairly weak. (I do, however, agree with Huemer that we have a prima facie moral duty to seek the truth even when the consequences are bad.)
This is highly nonobvious to me and I would love to hear you explain why. It makes sense to me why decision makers have a moral duty to seek the truth about the topics relevant to their decision (the President or a CEO should be thoughtful and well informed). But why do ordinary people like me have a moral duty to seek truth about things which we have no hope of affecting? Why is it even supererogatory to seek truth in these cases?
I’m definitely not proposing a duty to spend lots of time or money learning useless things. What I’m proposing, at minimum, is that you should try to believe what is most likely to be true given the information you possess.
Does this really seem so strange? If someone accused you of “believing whatever makes you feel best,” wouldn’t this seem like a harsh criticism of your character? If someone claimed that “X puts the truth first,” wouldn’t this seem like high praise?
Related: Would you find a prima facie duty to tell the truth (i.e., not to lie) implausible? If not, why isn’t a prima facie duty to believe the truth likewise plausible?
I believe that people speak of humor in a subjective sense and an objective sense, as they do with coolness, beauty, cuteness, and other such qualities. “That was funny to me” is a coherent statement, but so is “It’s funny but not to me.” Someone could also engage in a conversation about why “Most people think it’s not funny, but it is.” These statements seem semantically correct and not incoherent. If every English speaker died and no one could understand Dave Chappelle’s comedy specials, I would think that they remain funny.
I tend to agree, though I don’t know how I’d start convincing someone who disagreed.
Anecdotally, it’s pretty commonly recognized that persistence is one of the most important attributes for an entrepreneur or inventor.
I don’t want to spend too much time hashing this out in full, but here is another story, from roughly 20,000 years ago. It involves two tribes, the Pushovers and the Pigheads. They each had their Creative Genius. One time, the Genius from the Pushover tribe comes into the cave with a burning log and says “Look, we can control fire! We can use it to cook our food, have light to work inside our cave, and stay warm during winter!” The others objected. They didn’t want the smoke, and it seemed too dangerous. So the Pushover Genius said, “Oh well, you are right, it was probably a dumb idea.” That winter the Pushover tribe got wiped out by the Neanderthals. The Pighead Genius also came into the cave one time with a burning log and the same speech. She was met with the same response. She then proceeded to demonstrate how to use the fire in a pit, but the others kicked her and her log out. Then she approached her friends one by one to convince them to let her try the fire in the cave. She made campfires and slept by them, and cooked her food. Over time, the resistance softened, and the other Pigheads came around to at least wanting to try this fire thing. They all survived the winter, beat the Neanderthals, and became fruitful and multiplied. Obviously, being a Pighead is a survival trait.
Not if your new idea is bad… as most new ideas are! What’s obvious, I’d say, is that there’s an optimal degree of openness for survival. Extreme Pushovers and Extreme Pigheads both put themselves in danger relative to following a Golden Mean.