
6 Surprising Scientific Findings About Good and Evil

Harvard's Joshua Greene on the evolution of morality—and why humanity may, objectively, be getting better in the long run.

The following post first appeared on .

Maybe you already know the famous hypothetical dilemma: A train is barreling down a track toward five people, who are certain to die if nothing happens. You are standing at a fork in the track and can throw a switch to divert the train to another track—but if you do so, one person, tied to that other track, will die. So what would you do? And what do you think your fellow citizens would do?

The first question is a purely ethical one; the second, however, can be investigated scientifically. And in the past decade, a group of researchers has been pursuing precisely that sort of investigation. They've put our sense of right and wrong in the lab, and even in the fMRI machine. Their findings have begun to dramatically illuminate how we make moral and political decisions—and, perhaps, will even reshape our understanding of what morality is in the first place.

"The core of morality is a suite of psychological capacities that enable us to get along in groups," explains Harvard's Joshua Greene, a leader in this research and author of the new book  Moral Tribes: Emotion, Reason, and the Gap Between Us and Them, on the latest episode of the  Inquiring Minds podcast (listen above). The word "group" here is essential: According to Greene, while we have innate dispositions to care for one another, they're ultimately limited and work best among smallish clans of people who trust and know each other.

The morality that the globalizing world of today requires, Greene argues, is thus quite different from the morality that comes naturally to us. To see how he reaches this conclusion, let's go through some surprising facts from Greene's research and from the science of morality generally:

1. Evolution gave us morality—as a default setting. One central finding of modern morality research is that humans, like other social animals, naturally feel emotions, such as empathy and gratitude, that are crucial to group functioning. These feelings make it easy for us to be good; indeed, they're so basic that, according to Greene's research, cooperation seems to come naturally and automatically.

Greene and his colleagues have shown as much through experiments in which people play something called the "Public Goods Game." A group of participants are each given equal amounts of tokens or money (say $5 each). They are then invited to place some of their money in a shared pool, whose amount is increased each round (let's say doubled) and then redistributed evenly among players. So if there are four players and everybody is fully cooperative, $20 goes into the pool and $40 comes out, and everybody doubles their money, taking away $10. However, participants can also hold on to their money and act as a "free rider," taking earnings out of the group pot even though they put nothing in.
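The payoff arithmetic above can be sketched in a few lines of code. This is only an illustration of the game's structure as described here, not the actual experimental protocol; the function name, the $5 endowment, and the doubling multiplier are taken from the example in the text.

```python
# A minimal sketch of one round of the Public Goods Game described above.
# The endowment ($5) and multiplier (doubling) follow the example in the text.

def public_goods_round(contributions, endowment=5.0, multiplier=2.0):
    """Return each player's final payoff after one round.

    Each player starts with `endowment`, contributes some amount to the
    shared pool, the pool is multiplied, and the proceeds are split evenly.
    """
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    # Payoff = whatever you kept back + your equal share of the multiplied pool.
    return [endowment - c + share for c in contributions]

# Full cooperation: four players each contribute their whole $5.
print(public_goods_round([5, 5, 5, 5]))  # everyone ends with $10.0

# One free rider contributes nothing and comes out ahead.
print(public_goods_round([5, 5, 5, 0]))  # cooperators get $7.50, the free rider $12.50
```

Note that the free rider's payoff always exceeds a cooperator's in a single round, which is exactly why free-riding can unravel cooperation over repeated play.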

In one Public Goods Game experiment, Greene and his colleagues decided to speed the process up: They made people play the game faster, forcing them to decide more quickly. The result was more "moral" behavior and less free-riding, suggesting that cooperation is a default. "We have gut reactions that make us cooperative," Greene says. Indeed, he adds, "If you force people to stop and think, then they're less likely to be cooperative."

2. Gossip is our moral scorecard. In the Public Goods Game, free riders don't just make more money than cooperators. They can tank the whole game, because everybody becomes less cooperative as they watch free riders profit at their expense. In some game versions, however, a technique called "pro-social punishment" is allowed. You can pay a small amount of your own money to make sure that a free rider loses money for not cooperating. When this happens, cooperation picks up again—because now it is being enforced.
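The punishment mechanic described above can also be sketched in code. The pre-punishment payoffs below come from the four-player, $5 example earlier in the article; the $1 cost and $3 fine are illustrative assumptions (a cost-to-fine ratio of this kind is common in such experiments, but the text does not specify the amounts).

```python
# A minimal sketch of "pro-social punishment," with illustrative numbers.
# The $1 cost and $3 fine are assumptions not given in the article.

def punish(payoffs, punisher, target, cost=1.0, fine=3.0):
    """The punisher pays `cost` out of their own payoff to impose `fine` on the target."""
    out = list(payoffs)
    out[punisher] -= cost
    out[target] -= fine
    return out

# Pre-punishment payoffs from a round where player 3 free-rode:
# three cooperators earned $7.50 each, the free rider $12.50.
payoffs = [7.5, 7.5, 7.5, 12.5]

# Each of the three cooperators pays $1 to fine the free rider $3.
for punisher in range(3):
    payoffs = punish(payoffs, punisher, target=3)

print(payoffs)  # [6.5, 6.5, 6.5, 3.5] -- free-riding no longer pays
```

A single punisher here would leave the free rider still ahead ($9.50 vs. $7.50 and $6.50); it is when enough group members punish that defection stops being profitable, which matches the finding that cooperation recovers once it is being enforced.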
