valuing that others not have abortions loses to their valuing choice, but the AI arranges things so that most pregnancies are wanted and the conflict rarely comes up; valuing the torture of sinful children loses to the children’s desire not to be tortured, and also goes away with a slight increase in intelligence and wisdom
How could you ever guarantee that? Do you think progress toward utilitarian values increases with intelligence/wisdom?