Raise your hand if you (yes, you, the person reading this) would submit to 50 years of torture in order to avert a “least bad” dust speck momentarily finding its way into the eyes of an unimaginably large number of people.
Why was it not written, “I, Eliezer Yudkowsky, should choose to submit to 50 years of torture in place of a googolplex of people getting dust specks in their eyes”?
Why restrict yourself to the comforting distance of omniscience?
Did Miyamoto Musashi ever exhort the reader to ask his sword what he should want? Why is this not a case of treating a tool as an end in itself rather than as a means to a desired end?
Are you irrational if your “something to protect” is yourself... from torture?
Has anyone ever addressed whether this applies to the AGI Utility Monster, whose experiential capacity would presumably exceed that of the ~7 billion humans who should then rationally subserve Its interests (whatever those may be)?
For the sake of the blook: neuroscientists, not neurologists. Words can be wrong.