Would a non-human ‘perfectly rational’ alien — one that didn’t hyperbolically discount and had no tendency to retroactively update its beliefs or moral system so that its past actions no longer counted as evil — still need to worry about slippery slope arguments?
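To make the hyperbolic-discounting half of the question concrete, here is a minimal sketch (with made-up reward amounts, delays, and discount parameters) of the preference reversal that hyperbolic discounters suffer and exponential discounters don't: from far away the agent prefers the larger-later reward, but as the smaller-sooner reward draws near its preference flips, which is the standard mechanism for sliding down a slope one defection at a time.

```python
def hyperbolic(amount, delay, k=1.0):
    # V = A / (1 + k*D): value falls off steeply at short delays,
    # gently at long ones, so relative rankings can flip over time.
    return amount / (1.0 + k * delay)

def exponential(amount, delay, delta=0.9):
    # V = A * delta**D: the same discount rate at every horizon,
    # so rankings between two fixed rewards never flip.
    return amount * delta ** delay

small = (50, 5)    # illustrative: 50 units at t=5
large = (100, 10)  # illustrative: 100 units at t=10

for label, discount in (("hyperbolic", hyperbolic), ("exponential", exponential)):
    for now in (0, 4):  # evaluate far from (t=0) and near (t=4) the small reward
        s = discount(small[0], small[1] - now)
        l = discount(large[0], large[1] - now)
        choice = "small-sooner" if s > l else "large-later"
        print(f"{label:11s} t={now}: small={s:6.2f} large={l:6.2f} -> {choice}")
```

The hyperbolic agent picks the large-later reward at t=0 but switches to the small-sooner one at t=4; the exponential agent sticks with large-later at both times, which is why the alien in the question is immune to this particular source of slipperiness.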
Not from hyperbolic discounting or value drift. Maybe from other sources, like the coalition argument presented by Yvain.
You could still hit them with large rewards for making themselves less rational, and thus recreate the slippery slope argument along that axis.
Or large rewards for changing their utility function.