Here’s a thought experiment that can help measure how morally uncertain we are:

Omega offers you a choice between two things:

1. You can sleep one hour less per night, every night this year, with no negative consequences, and you will have the willpower to use the extra time to work toward your goals.

2. You get a “groundhog year”: you repeat the current year, and use the first pass to spend time clarifying your moral values.

What do you choose? For what values of X (extra hours per night) and Y (repeated years) would you be indifferent?

Let’s presume you don’t go crazy from repeating multiple years, and that the utility of the marginal hours doesn’t diminish enormously with each one you add. Let’s also assume that “groundhog years” somehow don’t create any (dis)utility themselves. I’m sure there are many other problems with this thought experiment, and I’d be happy for someone to suggest an alternative. However, for consequentialist, maximizing theories it seems to work reasonably well.

Someone with a high X and a low Y is quite morally uncertain. They find it plausible that they will come to better values after more thought, introspection, or whatever other process they use to discover new values.
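The trade behind X and Y can be made concrete with simple arithmetic. Here is a minimal sketch; the linear-utility assumption and the specific example numbers are mine, not part of the thought experiment:

```python
# Sketch of the Omega trade, assuming utility is roughly linear in
# productive hours (as the stated assumptions permit). The example
# numbers below are illustrative, not taken from the post.

NIGHTS_PER_YEAR = 365

def extra_work_hours(x_hours_per_night: float, years: float = 1.0) -> float:
    """Total extra productive hours from sleeping x fewer hours per night."""
    return x_hours_per_night * NIGHTS_PER_YEAR * years

def uncertainty_ratio(x_indifference: float, y_indifference: float) -> float:
    """Productive hours you'd forgo per year of value clarification.

    A higher ratio means you price value clarification more highly,
    i.e. you are more morally uncertain.
    """
    return extra_work_hours(x_indifference) / y_indifference

# Someone indifferent between 3 extra hours/night for a year and
# 1 groundhog year implicitly prices a year of clarification at
# 3 * 365 = 1095 productive hours.
print(uncertainty_ratio(3, 1))
```

On this reading, "high X and low Y" simply means a large ratio: you would trade many productive hours for relatively little clarification time.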

“Better values”?

If we believe we may find values we consider to be “better” than the ones we currently hold, we must be evaluating our current and potential future values against some standard, or meta-value. We could also be uncertain about this meta-value, but at some point we hit bottom and must have some value that we are completely morally certain of. This could be something fairly complicated, such as “(my current best guess) + (X)(my best guess with Y amount of extra information)”.

There doesn’t seem to be any way in which a change to your bottom-level value can ever be good; such a change is always value drift.

Value drift or reducing moral uncertainty?

Suppose Ruairi_2020 discovers a new ethical theory, new-tilitarianism, that better fits Ruairi_2014’s bottom-level value. However, during that time Ruairi also went through value drift and became completely selfish. Would I, Ruairi_2014, want Ruairi_2020 to be a new-tilitarian or behave selfishly? I would want him to be a new-tilitarian.

If your bottom-level value ever changes, this is value drift. If your value system changes to become more in line with your bottom-level value, this is moving closer to what you think is ethical.

In reality both processes, value drift and finding values and actions more in line with your bottom-level value, simply involve becoming a new agent. Your current self just needs to decide (based on your bottom-level value) which processes of forming that new agent it is okay with.

When we take (the lack of) personal identity seriously, we see that future versions of you are separate agents, and you simply decide now through which kinds of processes you want those new agents to be created.

Arbitrariness

Your bottom-level value is defined by your decision. What privileges it over other preferences is that you decide to privilege it. At this most fundamental level, you cannot be mistaken about what you want.

For example, I try my best not to follow rampantly selfish preferences, though it’s definitely the case that I have them. We also (hopefully) decide to ignore other intuitions that are immoral.

For most people, an ethic or value system isn’t actually a description of our conflicting moral intuitions. Instead it’s something we decide to follow, often because it captures what we think is most important from a moral perspective, while also helping us avoid biases and remain consistent.

A value system is a consistent close approximation of the things we care about. If you’re lucky, there will be a value system you find satisfying and exhilarating. But there is no reason you can’t ignore certain intuitions for the sake of consistency, or for other reasons.

Deferring to your future self and others

Your future consciousness moments and the values they have may be different only in degree and not in kind from the conscious moments of other people.

If you wish to defer to the values of your future selves, it may also make sense to give some weight to the values of others who you believe arrived at their values through processes you consider reliable at producing good values.

By Ruairi Donnelly

