Haidt says, “intuitions come first, strategic reasoning second.” But how are intuitions and reason related? Historically, Plato thought emotions should be mastered by reason, considering emotions not very useful. Thomas Jefferson believed in an egalitarian relationship between reason and emotions, the two working together without one ruling the other. David Hume thought reason ought to be the slave of the passions, recognizing that emotions are very useful. Haidt believes Hume best described the relationship between reason and emotions. The mind is like a rider (controlled processes; reasoning) on an elephant (automatic processes; emotions, intuitions), and Haidt argues that the human brain evolved such that the rider came to serve the elephant. The phenomenon of “moral dumbfounding” shows that when someone has strong feelings about what is right and wrong, they sometimes struggle to justify those feelings with moral reasons. No matter what they come up with on the fly, their intuition doesn’t change. “You complete me,” say our heads to our hearts.
Also in favor of this view is a part of the brain called the vmPFC (ventromedial prefrontal cortex), which is responsible for integrating emotions into controlled, conscious reasoning. When it is damaged, our lives start to fall apart: we keep our IQ and can still say in the abstract what is right and wrong, yet we make antisocial and unwise decisions at work and in our personal lives and become estranged from our social relationships. In other words, “gut feelings and bodily reactions [are] necessary to think rationally.”
Haidt’s social intuitionist model of moral reasoning is a socialized version of Hume’s view. Moral reasoning is fundamentally social because it’s not “something people do by themselves in order to figure out the truth.” Part of the reason we justify our moral intuitions is to convince others to change their minds, which is especially true of political and moral issues. Haidt says that if you want to win an argument with someone, talk to their elephant first.
If you ask people to believe something that violates their intuitions, they will devote their efforts to finding an escape hatch - a reason to doubt your argument or conclusion. They will almost always succeed.
One of the important insights for me is about how we morally interact with others. First, it’s important to know that, contrary to popular depictions, emotions are not the opposite of reasoning. Rather, “[e]motions are a kind of information processing.” The reason we make terrible social decisions when our vmPFC is damaged is that our brain lacks input from our emotions in the decision-making process. Moral judgment, moreover, is a cognitive process. Reasoning and emotion are simply two different kinds of cognition: reasoning is controlled, while emotions and intuitions are automatic. Intuitions are the “rapid, effortless, moral judgments and decisions” we make every day.
Going back to the rider and elephant image, the rider is the mouthpiece of the elephant, serving it and skillfully providing justifications and explanations for what the elephant does. But as humans, our elephants can be influenced and our minds can be changed. Here’s how this works:
Some event triggers X’s intuition, then X makes a judgment, and then X justifies either their judgment or intuition (typically with a reasoned argument).
X’s justification (reasoned argument) now becomes the triggering event for Y’s intuition as X attempts to persuade them.
Y then makes a judgment and then offers a reasoned justification or argument. But X’s judgment itself can also sway Y’s intuition, and vice versa.
Y can then come back, after being triggered by X’s judgment, and challenge X’s judgment or argument.
This is how friends, especially, and others can do “for us what we cannot do for ourselves: they can challenge us, giving us reasons and arguments…that sometimes…change our minds.”1
This is why political conversations, among others, become so frustrating when minds aren’t changed. Haidt says this happens because our “moral reasons are the tail wagged by the intuitive dog.” How do you know a dog is happy? Because it’s wagging its tail. The problem is that we try to wag the tails of our supposed political enemies through brute persuasion or argumentation, which typically doesn’t work; a dog’s tail communicates its emotions, not a reasoned argument. The illustration tells us how to go about trying to change someone’s mind: start by appealing to their intuitions. Far from being manipulative, this requires empathy. Henry Ford knew this very well:
If there is any one secret of success it lies in the ability to get the other person’s point of view and see things from their angle as well as your own.
Here are some questions we might reflect on:
How do you see this squaring with Scripture or whatever source(s) you take to be authoritative?
How do your conversations typically go, especially the political or moral ones?
Do they match the progression of intuition, judgment, and then reasoned justification or argument?
How often is your mind changed? How do you go about it? How often are people trying to change your mind? How do they go about it?
How often do you try to change someone else’s mind? Is trying to change someone’s mind considered appropriate in your social group or culture?
Would empathy help you or others have better conversations?
Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion (New York: Pantheon Books, 2012), 47. All quotes can be found between pages 27 and 50.