Editorial: Morality differs from person to person

We don’t often think too deeply about moral dilemmas, and I imagine you couldn’t care less about what I have to say on any given moral issue. That’s alright.

What I’m interested in is the fact that we undoubtedly have different views — the fact that, presented with the same set of moral dilemmas, you and the person closest to you will almost certainly differ on at least one. Morality is relative. That’s fascinating, but also problematic.

If you haven’t already had a debate with someone close to you because of a difference in moral views, you will in the future.

Those debates, I’ve found, can easily turn into arguments, not about the merits of one course of action over another, but about who is right.

That is what interests me: the idea that some moral views are better than others.

The idea that my moral stance on a given issue is right, and yours is wrong.

I’ve always thought morality was just a product of our society, and that what is evil to one person might be a moral imperative to another. That always made sense.

I liked that idea, because it meant that I didn’t need to figure out what was right overall, just what was right for me.

But I realised that a simple difference of opinion can cause a lot of conflict, and when that opinion is an integral part of one’s identity, as morality often is, that conflict can turn violent.

That was the realisation that prompted me to think about moral diversity in the first place. Nobody questions the fact that two and two make four, so why can’t we have an absolute, unquestionably right set of morals?

After all, the rest of the universe has a definite system. The laws of physics are absolute. In physics, there are right and wrong answers.

I once tried to come up with a system like that, to see if morality could be simplified.

I thought that pain would make a reliable metric, since it’s so universal, but it fell short. Everyone ranks the pain of a situation differently.

I tried other metrics, too. They all had the same flaw: they were absolute at their core, but were measured differently by different people.
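
To make that flaw concrete, here is a toy sketch. Everything in it is invented for illustration: the situations, the scores and the two people’s weightings are hypothetical. It only shows how an “absolute” pain score still produces different rankings once people weigh its components differently.

```python
# Hypothetical illustration: an "absolute" pain score per situation,
# weighed differently by two people. All names and numbers are invented.

situations = {
    "white lie to spare feelings": {"physical": 0, "emotional": 2},
    "harsh public criticism": {"physical": 0, "emotional": 8},
    "painful medical procedure": {"physical": 6, "emotional": 1},
}

# Both people accept "minimise pain" as the goal, but weigh its components differently.
weights = {
    "person_a": {"physical": 1.0, "emotional": 0.2},
    "person_b": {"physical": 0.2, "emotional": 1.0},
}

def rank(person):
    """Order the situations from least to most painful under one person's weighting."""
    w = weights[person]
    total = lambda name: sum(w[kind] * amount for kind, amount in situations[name].items())
    return sorted(situations, key=total)

print(rank("person_a"))  # least to most painful: lie, criticism, procedure
print(rank("person_b"))  # least to most painful: lie, procedure, criticism
```

Same metric, same numbers, different conclusions. That is exactly where my attempt fell apart.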

You and I may have the same moral end goal, but the path we choose to get there will almost certainly differ.

So why did I think this was worth writing about?

Consider healthcare. If you are injured and in extreme pain, will your doctor give you strong medication and risk addiction, or let you suffer to ensure your long-term well-being?

Consider AI. If a group of pedestrians suddenly walked onto the road, ignoring right-of-way, would a self-driving car swerve and risk hitting a pedestrian rightfully walking on a crosswalk?

A study conducted by MIT, called Moral Machine, asked participants to choose a course of action in situations like the one described.

Those answers will be used to program the “morality” of self-driving cars. Our moral system directly impacts which people they will choose to kill or save.
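
To picture what “programming morality” from survey answers could even mean, here is a deliberately crude, hypothetical sketch. It is not the Moral Machine’s actual methodology; the scenario, the votes and the function are all invented. It just shows the simplest possible policy: do whatever the majority of respondents said.

```python
# Invented example: turn crowd-sourced answers for one dilemma into a fixed rule.
from collections import Counter

# Imagined responses to: "a group walks onto the road; one person is on the crosswalk."
# "stay" = keep course toward the jaywalkers, "swerve" = risk the crosswalk pedestrian.
votes = ["stay", "swerve", "stay", "stay", "swerve"]

def majority_policy(scenario_votes):
    """Pick whichever action most respondents chose for this scenario."""
    return Counter(scenario_votes).most_common(1)[0][0]

print(majority_policy(votes))  # -> "stay"
```

Even in a toy like this, someone’s moral weighting is baked into the data before the car ever moves.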

One more scenario. Given our own moral diversity, imagine how different the morality of alien civilisations could be. What would happen if we met them?

Can we accept a wildly different moral system for the sake of peace? Or will we decide that we are right and they are wrong?

To quote the Doctor, “sometimes the only choices you have are bad ones, but you still have to choose.” What do you do when you and the people around you choose differently? Food for thought!
