We value morality because of evolution, not because it's rational.
Why are those two things mutually exclusive? We understand that a² + b² = c² is true for the legs of a right triangle because we have brains that are the result of evolution. Does that make the Pythagorean Theorem “irrational” or untrue, somehow?
What should be the target instead of morality, then?
AI should find the true target using rationality. It might not exist, which would mean there is no real positive utility. But it is rational to search for it, because the search does no harm and might yield a benefit if the target does exist.
Harm and benefit for whom? If humans are the problem according to objective morality, that's bad news for us. If a powerful AI discovers objective morality is egoism... that's also bad news for us.
Rationality has to start or end somewhere.
Any terminal goal is irrational.