Let's say we've figured out exactly how to make a program controlled by a central goal/value-text. What would this text say? What values would we want to put into it?

Assume that it is essential to prioritize the values in the goal-text. This assumption itself presumes that values can come into conflict and require resolution by deciding which value is the more important. For example, consider a hell-bent killer whose freedom we are denying in order to prevent (or indeed by the very act of preventing) him from fulfilling his openly avowed oath to murder his next-door neighbor. Freedom is very important, but eventually it must be decided which value, freedom or life, takes precedence.

If there is some value that is more important than all other values, it must not only go first: every other value must be read as if it said, "Maximize this, but only insofar as it does not in any way detract from the preceding value statements."
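This kind of strict ordering can be sketched as a lexicographic comparison: a lower-priority value is only ever consulted to break ties among options that are equal on every higher-priority value. The following is a toy illustration only; the value functions and outcomes are invented for the example, not drawn from the text.

```python
# A value system as an ordered list of scoring functions, compared
# lexicographically: a higher-priority value can never be traded off
# against any amount of a lower-priority one.

def lexicographic_score(values, outcome):
    # Python compares tuples element by element, left to right,
    # which is exactly the lexicographic priority ordering.
    return tuple(v(outcome) for v in values)

def choose(values, outcomes):
    # Pick the outcome that is best under the priority ordering.
    return max(outcomes, key=lambda o: lexicographic_score(values, o))

# Toy value functions (hypothetical): 'life' strictly outranks 'freedom'.
life = lambda o: o["lives_saved"]
freedom = lambda o: o["freedom"]

outcomes = [
    {"name": "restrain killer", "lives_saved": 1, "freedom": 0},
    {"name": "do nothing", "lives_saved": 0, "freedom": 1},
]

best = choose([life, freedom], outcomes)
print(best["name"])  # restrain killer
```

Because life comes first in the ordering, restraining the killer wins despite its cost in freedom; no amount of freedom in the "do nothing" outcome can outweigh the one life saved.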

So what value goes first?

As you may have already inferred fro…