According to Yudkowsky, consequentialists should choose torture over dust specks, since less total harm occurs.
I hope Yudkowsky was a little more careful with the wording. Consequentialists who are also sadists would choose specks, and consequentialists who are also paperclippers wouldn't care either way unless one option wasted resources that could be used for paperclip manufacture.
(Incidentally, we can tell this isn't Eliezer's wording, since his declared usage of 'should' is such that everything 'should' prefer torture to dust specks, whether he, she, or it is a deontologist, a consequentialist, or a UFAI that is indifferent to humanity.)
Yudkowsky is well aware of that. I'd assume he thinks it true of human values, or at least rational human values, or… I'll just update the essay to clarify that a bit.
'Rational human values' still looks un-Yudkowskian, though.
Edit: unless understood as 'values of a rational human' rather than 'rational values of a human'. I need to pay more attention to such double readings in English.
I don’t think he would admit rational as an attribute to values.
I meant rational insofar as avoiding scope insensitivity.
I agree Yudkowsky probably wouldn’t word it that way. My apologies.
Certainly nothing to apologise for. I should apologise for my nitpickery.