Also, I’m not at all confident that compromising with the Superhappies would be very bad, even before considering the probably larger benefit of them becoming more like us. I think I’d complain more about the abruptness and exogenousness of the change than the actual undesirability of the end state. As others have pointed out, though, a policy of compromise would lead to dilution of everyone’s values into oblivion, and so may be highly undesirable.
More generally and importantly, though, I wonder if the use of wireheading as a standard example of “the hidden complexity of wishes” and of FAI philosophical failure (and it is an excellent example) leads me and/or other Singularitarians to have too negative a reaction to, well, anything that sounds like wireheading, including eliminating pain.