You should have used a "Memento" image here where *spoiler alert* the main character chooses what he will believe in the future even though he can't control it immediately. I believe we are all subject to that kind of strategy in a less direct way than that character.
Might not doxastic voluntarism imply an existential regression problem à la Kierkegaard?
Once one admits the role of the will in belief formation in particular instances, it follows that it must also play the same role in forming beliefs about the very tools and structures one uses in choosing which beliefs to form, e.g. what constitutes a valid reason, the validity of formal systems of logic itself, etc.
At bottom, it neuters the nominal upshot of making beliefs the stuff of blameworthiness, since to say "You should believe X" is really to say "I will it that you should will elsewise such that you will have chosen a chain of beliefs that will terminate in you willing X". Hardly an impassioned appeal, that.
Is this an epistemically nihilistic hell? Are them's simply the breaks? To paraphrase Hamlet, "There is nothing either good or bad, but willing makes it so."
"[The wise and virtuous man] remembers, with concern and humiliation, how often, from want of attention, from want of judgment, from want of temper, he has, both in words and actions, both in conduct and conversation, violated the exact rules of perfect propriety, and has so far departed from that model, according to which he wished to fashion his own character and conduct." (Smith TMS, 247–48.25)
While Socrates may be wrong if you interpret him as saying that no one willfully does things they take to be morally wrong, it's much more plausible if you interpret him as saying that no one willfully does things that they don't take to have the most reason in their favor. While moral reason is one sort of reason, many people think it can be overridden by personal reasons, artistic reasons, or any number of other types of reasons.
I also don't think the cult case provides a counterexample even to the strong form of doxastic involuntarism. It doesn't prove that *you* can choose to believe something you currently think is false; it just proves that there are some external stimuli that can cause you to believe something you currently think is false.
Of course, this leaves the main point of this post unscathed. We absolutely can voluntarily do things now that put us in positions to later end up involuntarily believing one thing or another, and so there is some sort of voluntarism about belief, even if only at a distance. It's not unreasonable to hold people (partially) responsible for this sort of thing.
For Ayn Rand, man has the power to focus or evade focusing. For Aristotle, the drunk is not responsible for what he does while drunk, but he is responsible for getting drunk.
I would suggest that people are unable to change their beliefs unless they are *willing to examine them*. Once we reach the maturity to recognize that we could be mistaken, and seek humbly to gain a more correct perspective, our beliefs will almost certainly change, and will continue to do so in at least an incremental fashion as long as we continue the process. Sadly, I have found that many people seem to be terrified of examining their beliefs, apparently from the fear of ever having to recognize or admit that they might formerly have been mistaken. As long as we allow fear of being wrong to keep us from honestly examining our beliefs, we are locked into our errors and unable to progress.
I also think the world can be very overwhelming and that beliefs are a way to simplify it. If people are overwhelmed, their belief often fills the psychological need that your challenge to it can create, strengthening it.
Hm. I find this post to have some points, and to be inaccurate in others.
For me the Socratic position is more complicated than just "you inherently believe a thing, and can't choose that thing."
I see the things that affect your beliefs as more complicated than whether they're simply chosen or involuntary.
Things that affect your beliefs that are mostly outside your control:
Your biology and internal sensitivities
Past experiences and traumas
Social reinforcement
Authority figures' moves
Social status games in society (unless you are extraordinarily charismatic)
Random luck
What people you know
etc etc
I don't really think of people's beliefs as inherently what they believe. I think they are rather ways of perceiving the world, trying to predict it, and fulfilling various needs. The people who are so ideological that they would die for their belief usually die way too soon to be a major part of humanity.
Different parts of people's beliefs are also differently affectable.
And different people have different sensitivities and internal emotional incentives.
As for emotions getting the better of my beliefs: I genuinely don't know. I have some alexithymia and difficulty knowing my emotions. I have suspicions, though, that antidepressants made me more libertarian as they made me less neurotic, and even now that I'm on fewer antidepressants the status quo effect and all the things I've read make me stay libertarian.
Nice discussion of treating belief as a form of conduct, and hence an object of moral sentiment. That points directly to estimative justice (estimating objects properly), which, along with commutative justice (not messing with other people's stuff) and distributive justice (making a becoming use of what is your own; proper beneficence), is one of the three justices in Smith.
Pushing along this line, which I favor, tends to bring epistemology into ethics. Epistemology becomes the matter of estimative justice as pertains to ideas, that is, when the objects being estimated are confined to ideas (as opposed to, say, my tennis racquet).
I think that is one of the key issues here: we can't compare beliefs directly, but only infer them from what people say and what people do. However, beliefs evolve over time based on evidence, and so what we do to test and improve our beliefs is observable. Likewise, our stated beliefs and our actions are obviously at odds at times. ("I know I shouldn't do this, but...") That leads us to ask what someone's actual beliefs are, but at the same time it should lead us to question their logical processes and whether they are using the beliefs they act upon or the beliefs they profess when they are making decisions. If, say, their immediate actions follow one apparent belief, but their long-term decisions follow another apparent belief, then they apparently hold contradictory beliefs, one of which must be false, but which they simultaneously believe.
Choosing to be dogmatic or to be intellectually heteronomous is not to choose one's beliefs.
https://jclester.substack.com/p/belief-and-libertarianism
https://jclester.substack.com/p/intellectual-autonomy-and-libertarianism
Intriguing but argumentatively messy. Willful errors happen all the time in computer science testing. But that seems to require practice as well.
Bryan,
You might seem to prove your point with a tautology: Any moral proposition that does not receive reasoned assent is by definition a "belief." But even that proof fails on the assumption that we can enjoy a "reasoned assent" from some godlike "Cartesian theater" when reason itself is tainted by what Hume calls the "passions." Then there is that other angle (mentioned by you as "the noble lie"), by which we with full reason assent to a lie, on the assumption that good consequences may nonetheless follow from it. E.g., G.K. Chesterton's remark that atheists reject religion not to believe nothing, but to believe ANYTHING – as amply proven by today's climate and fossil fuel Savonarolas.
Reason is a tricky quantity, and certainly not a "univocal term." There is a very smart astrophysicist now, one Jeremy L. England, who is simultaneously a quantum expert and a proselytizing Jewish rabbi. Is he reasonable or does he have a dissonance blind spot for the god of Abraham?
There may not be a collective reason, but likely there is a collegial one.
What happens if one goes the Michael Huemer route
And says "one can't control how things APPEAR to them"?
If BC still thinks the same things, then that has some interesting implications
There is a more pervasive version of "people only do evil out of ignorance" in much public policy, especially health policy. It takes the logic that "if only people knew that X was bad for them, then they would not do X." Thus we get many millions wasted on healthy eating campaigns, exercise campaigns, etc., and trite messages dressed up as education justify considerable taxpayer expenditure.
It's kinda social desirability bias as well.
Instead of saying "people want to make choices we don't like," we say "they make these choices because they are uneducated, and WOULD make the choices we like if they knew."
This can sometimes be true, but often not
I've been trying to puzzle out all day why this sounds harsher than Yudkowsky's old "Avoiding Your Belief's Real Weak Points" in the Sequences. You've managed to add a sense of moral responsibility which is interesting.
You can, in fact, cut off your arm, and people have done it. Aron Ralston is probably the most famous example; more controversial are the people with body dysmorphia.
I don't think I could will myself back into belief in God, even with stats showing the religious to be happier & healthier. Any attempt would run into the obstacle that I don't believe and would thus discount any argument in that direction. Actions are what we have control over, though as you note those can include actions such as not checking our preferred hypotheses for falsification while applying rigorous scrutiny to what we don't want to believe.
Heyyyy...I mentioned *doxastic voluntarism* to you in an email about your Ayn Rand post back in June, even including the distinction between direct DV and indirect DV! (I had wanted to post the comment to the site, but at that point at least Substack wouldn't allow comments on your imported EconLog posts.) I'm going to at least pretend that my email was a first step in excavating that concept from your long-term memory. :)
OK, thanks for the link. Philosophy guides the use of the mind.