Imagine yourself, or any other reader of this blog, completing these statements (B statements would preferably be about someone you don’t know personally):
A. I am an atheist because ______.
B. He/She is a Christian because ______.
A. I am an Anarchist because ______.
B. He/She is a statist because ______.
And so on; you can make up as many A and B statements as you have belief systems or positions.
Here is what I think would be answers to A statements that most people would give:
“I have carefully evaluated the arguments for and against, and come to a rational conclusion.”
“It’s the right thing to do.”
“I choose not to believe in such claptrap.”
B statement answers generally look more like this:
“That’s the way he was raised.”
“That’s what’s acceptable in our society.”
“Ey’s an imbecile.”
In general, we attribute volition to ourselves by pointing to the internal factors which make us hold a given belief, but prefer to think that others hold opposite beliefs because of external factors (stating that someone is an imbecile also contributes to this, insofar as it implies gullibility, vulnerability to intellectual manipulation, and thus a certain degree of lack of agency). I chose to believe what I believe, they’ve never “really” chosen.
This is an obvious defense mechanism. We want to believe that the beliefs we oppose are held spuriously, because to us the beliefs are spurious. What if we honestly believed the opposition was just as self-motivated as we are? This would be rather troubling. For instance, it would mean that if our context was different, we might hold to those opposite beliefs that we find idiotic or repugnant, making us idiotic and repugnant as well. And who wants to have to face that?
I think there’s also a more simplistic form of reasoning there, which is basically that the stupider a belief appears to us, the stupider its proponent must be. I admit that this is a very attractive argument, but we all know that smart people (or at least educated people, which is not really the same thing) will readily believe stupid things.
There are exceptions, of course. I said that B statements should be applied to people you don’t know personally, because knowing someone personally makes you more likely to appreciate their intelligence, and thus makes you more biased in favor of their volition. You are also more likely to understand why they hold the beliefs they hold, simply by having talked to them about it.
This is an interesting dynamic because it’s actually the opposite of how we usually attribute intent to ourselves and other people. For instance, suppose that you cut in front of someone in traffic. Now suppose that someone else cuts in front of you in traffic. In the first case, you’d usually invoke circumstances, while in the second you’d invoke bad faith. In fact, we humans take offense at the smallest actions from other people, even when those actions were most likely unintentional.
The obvious difference is that, in the case of belief systems, we’re not attributing intent but rather cause. Were we to evaluate, for instance, whether a religious person’s remark that atheists find offensive was made intentionally, then we would fall back onto invoking bad faith. But if we were wondering why ey believes in what ey said, we would invoke lack of volition. It is a subtle difference, but an important one. Few people deny the importance of volition; they just contend that it plays a more or less important role in the whole process of our intellectual lives.
The particularly interesting thing is when you ask ex-Christians why they believed what they believed. As far as I’ve seen, they immediately fall back to B answers and refuse to attribute volition to themselves! This is rather bizarre behavior which I can only interpret as a desire to distance themselves from their former beliefs. When you identify yourself as an ex-anything, I suppose that’s an understandable impulse, but at the same time it paints a paradoxical picture of the individual.
It’s very comforting to believe that we are in control of our own thoughts. We don’t want to be mindlessly following belief systems just because they have somehow infected our mind. And we don’t want to think of belief systems we find abhorrent as being anything but mindless infections.
Unfortunately, this contributes to the dehumanization of the opposition and the “us vs. them” mentality (we actually think about what we believe, therefore we’re in the right and their opinions don’t count). It can lead to self-delusion (i.e. if I am one of the blessed few who actually think about this topic, then I am probably right about everything else too). Indirectly, it can also lead to increased statism (i.e. we know what’s best for everyone).
Now, as I think I’ve made clear in past entries, I don’t believe in free will. I think attributing volition is, strictly speaking, wrong. Obviously I can’t speak for anyone else, but when I look at the things I believe now, and the things I used to believe, I see a natural, smooth progression which is fueled by the expansion of my knowledge. That is to say, everything I believe in is a result of who I am. The reason I didn’t always believe what I believe now is that I was either misinformed or did not adequately know about these positions before now.
There’s no free will that I can see, no “choice.” I could no more choose to become a Christian, say, than I could choose to be a woman or a black person; furthermore, I have no reason to think that I am special in that regard. In fact, it’s always been a salient feature of my discovery of new ideologies that the people involved in them (people who presumably are the most “volitional,” in that they refuse to follow mainstream opinions) say things like “I’ve really always thought things like this, even when I was younger.” They know their belief, like all others, doesn’t come from any rational train of thought, although any belief can be rationalized after the fact (with more or less success, obviously).
Oh, I know we secular science-loving types all like to flatter ourselves by saying that we believe what we believe because it’s backed by rational thinking, or that we are particularly rational people. Call it a pleasant little white lie if you want, or a mass delusion if you’re feeling more cynical. You could say it’s an illusion generated by our brains, or a statement backed by nothing but the hope that one will actually be rational in the future (perhaps something akin to a prayer?).
None of this is to say that I don’t blame anyone for willful ignorance. People who are willfully ignorant are dangerous to others, and we should obviously stop people from being dangerous to others. This is merely a variant of the fact that even under determinism we should still stop people from committing crimes, in the same way that we stop non-biological machines from malfunctioning.
We don’t have free will. We don’t choose what we believe. All that we have affinity towards, whether material or intellectual, is fixed at birth, and finds its expression in our interactions with the world.
Some may see this as an overly simplistic explanation, given that people change their minds. But determinism does not preclude change; as I already pointed out, everything is in a state of flux, including our minds, and we should be rather surprised if our opinions never changed as a result. What determinism does indicate to us is that choice and rationality are not the reason why we adopt beliefs, even though they might provide extensive rationalizations for adopting a belief, and that there is no soul or self choosing beliefs detached from one’s personality, education and context of life.
As I already discussed, some positions are mainstream and well-known, and some are not. This greatly complicates the whole process, especially since people can have rough, unformed insights into beliefs with which they have no formal contact, because of their budding interests and/or personalities (I had such an experience with atheism at seven years old, and I’ve heard of others having such experiences grappling with other ideologies at a very young age). What it all means is that many people will inevitably change positions as they acquire knowledge, and that putting knowledge out there inevitably helps change some people’s minds, especially since they are interested enough to pursue it.
So, am I trying to make some kind of case that we should be nice to Christians or something? Not really. It may be that from a pragmatic standpoint we’re better off being all nice and accommodating to our opponents, but that comes with heavy disadvantages as well, and I’m no pragmatist anyhow. What I am saying is that we should be aware that we are lying to ourselves about how we, and other people, acquire and sustain beliefs, and that we should have more reasonable expectations. There isn’t really any way to extinguish belief in anything apart from eradicating all knowledge about it, and that’s impossible if only for practical reasons. The only way to address dangerous beliefs is by suppressing their expression, thus frustrating the believer, which carries its own dangers. The problem is fundamentally one of human nature, not an epistemic one, and no amount of rationality will solve it (as for my solution, well, hopefully you know what it is by now, if you’ve been reading this blog long enough!).