One issue which puts people off intuitionism is the issue of widespread disagreement: if morality is intuitive, then why do people, who presumably start from the same intuitions, disagree so much?
The first point to make here is that disagreement is not really as widespread as people believe. Yes, people profoundly disagree on important issues, but we focus on these issues precisely because there is profound disagreement. We don’t talk about all the issues on which we do agree because there’s no particular reason to do so, but this skews our perspective. If you sat down with a person whom you think of as your complete ideological opposite (such as, say, an ultra-conservative) and took a test on random moral and ethical issues, I think you’d find a great deal of agreement.
But profound disagreement does exist. How can this be explained from an intuitionist standpoint? Obviously the answer is that some people are not following their intuitions, or do not follow them in certain contexts.
One obvious example is religion. As physicist Steven Weinberg famously said:
Religion is an insult to human dignity. With or without it you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion.
I disagree that there is such a thing as “good people” or “bad people.” Actions are good or evil, but our capacity to be good or evil is vastly contextual. Religion itself is a good illustration of this, since no one is born with a religion, and yet it does make people do evil things.
But apart from that, I agree that religion makes people’s morality much worse. We observe that when Christians deconvert and call themselves atheists, most of the bizarre ethical principles they used to believe just fade away. So how does religion do that?
The general answer to that question is ideological bias:
People who are biased are moved by emotions and desires, other than the desire to have true beliefs and avoid false ones, to view some claims favorably and others unfavorably. In other words, there are some things we want to believe, and others we want not to believe. Often, we want to believe what is in our interests to believe, or to convince others of; what we would like to be true; what would be in the interests of the social group with which we identify to believe; and what coheres with the self-image we want to maintain.
Michael Huemer, Ethical Intuitionism, p137-138
Connected to this, Huemer names the major sources of bias: self-interest, wishful thinking, group interests, self-image, cultural traditions, religion, and philosophy.
Take the example of religion, which incorporates most of these sources. There is an obvious element of group interests, since religion itself, as well as specific sects, forms a group of people who thereby share common interests (e.g. in the wider propagation of group ideas, in the construction of more places of worship, and so on). But beyond group interests, which are general in nature, there are specific doctrines and mores which have to be followed.
These two aspects are not separate but interact with each other. Doctrines provide ammunition for recruitment, and the necessities of recruitment inform the way people worship (as exemplified by the New Testament stories, but also by the changes churches have made over the last decades). Other elements interact likewise: changes in cultural traditions eventually force changes in mores, because doing otherwise would make recruitment impossible.
Religion and philosophy (including politics) are definitely major sources of ideological bias. And yet not all ideologies are sources of ideological bias. So what makes religion and philosophy, but not some other ideologies, particularly conducive to it?
There are three attitudes that a group can have towards people’s personal values:
1. It may seek to develop them and help them be expressed in the organization.
2. It may seek to persuade people to take a different course.
3. It may seek to impose a new set of rules on the individual from “on high” (whether from some human authority, divine authority, or some other form of authority).
The third attitude is what produces ideological bias, because it forces the individual to abandon eir intuitions in favor of dogmatic rules. Sometimes fixed ideas lie behind it, since they provide the leverage to silence individualism and dissent.
Special Pleading comes into play because, outside of the domain occupied by that group, “real life” is still followed in spite of the doctrines that apply in that domain. So for example genocide is still mind-bogglingly evil in all other contexts, but in the context of the Bible genocide becomes praiseworthy, because reliance on the Bible must be defended at all costs. Therefore the Christian has to use Special Pleading when ey looks at genocide in the Bible, which ey does not use when talking about genocide in any other context.
Such use of Special Pleading is the surefire symptom of ideological bias at work. If you ask people to validate a principle they have used while changing the context (in ways that are not ethically relevant) and they balk, you know you’ve hit such a case.
I think it’s pretty clear that Special Pleading is by far the most powerful form of ethical error because it is generated by organizations, not by individuals, and therefore has far greater magnitude. One person here or there who engages in wishful thinking cannot compare to the impact of, say, the entire Catholic Church. Individual error is subject to peer pressure and other incentives in a way that an entire organization is not.
By contrast, ethical errors in hierarchical organizations (which in our society means all organizations, unless they are explicitly Anarchist) have a tendency to grow and feed on each other.
These errors can be implanted from the outside, in which case they are examples of co-optation, or they can be self-inflicted, in which case they are examples of groupthink. Once an organization begins seeking power or profit, it becomes more and more in its interest to adapt to the linear logic of how to obtain more and more of these things. And once an organization starts cultivating its own self-interested beliefs, it becomes more and more acceptable for people within the organization to accept and act upon these beliefs.
The process of groupthink develops new rules that become the norm, and anyone who refuses to adhere to the new rules becomes abnormal.
Freud had said in Totem and Taboo that acts that are illegal for the individual can be justified if the whole group shares responsibility for them. But they can be justified in another way: the one who initiates the act takes upon himself both the risk and the guilt. The result is truly magic: each member of the group can repeat the act without guilt. They are not responsible, only the leader is. Redl calls this, aptly, “priority magic.” But it does something even more than relieve guilt: it actually transforms the fact of murder. This crucial point initiates us directly into the phenomenology of group transformation of the everyday world. If one murders without guilt, and in imitation of the hero who runs the risk, why then it is no longer murder: it is “holy aggression. For the first one it was not.” In other words, participation in the group redistills everyday reality and gives it the aura of the sacred, just as, in childhood, play created a heightened reality.
Ernest Becker, The Denial of Death, p135-136
Psychologists have shown that people make these verbal switches when they’re in a we/they situation, in a your-group-versus-another situation…
If you’re a member of my group and you do something good, I make a general statement: “Noam Chomsky is an excellent person.” Now if you do something bad, I give a particular statement, “Noam Chomsky stepped on my toe.”
But it’s exactly reversed if you’re not a member of my group. If you’re not a member of my group and you do something good I say, “Noam Chomsky gave me directions to MIT.” But if he steps on my toe I say, “He’s a lousy organism,” or “He’s an inconsiderate person.”
So we generalize positively to ourselves, particularize negative and reverse it when we’re talking about other people.
The more people in a group think of their situation as “us versus them,” the more likely they are to reject attempts to correct nascent groupthink. Remember that one of the premises of Manichean thinking is that whatever “we” do is right. Anyone who opposes what “we” do (even if it’s the wrong thing to do) must therefore, logically, be against the group in toto. You are against the war in Vietnam, therefore you are anti-American. You don’t support the bullying of homosexuals, therefore you are anti-Christian. And so on.
Special Pleading is a gargantuan problem, and there’s no easy solution. But as I’ve already pointed out, intuitionism does provide us with a way to examine it and identify it: hypothetical scenarios. By isolating an action, and putting it in a different context, we can deduce the presence or absence of Special Pleading. In fact, atheists have been doing this for a very long time regarding Biblical atrocities and absurdities; but I think it’s such an intuitively obvious technique that pretty much everyone does it.